Source: Dr. P. Soundarapandian, M.D., D.M. (Senior Consultant Nephrologist), Apollo Hospitals, Managiri, Madurai Main Road, Karaikudi, Tamilnadu, India.
Creator: L. Jerlin Rubini (Research Scholar), Alagappa University. Email: jel.jerlin '@' gmail.com, Contact No: +91-9597231281
Guided by: Dr. P. Eswaran, Assistant Professor, Department of Computer Science and Engineering, Alagappa University, Karaikudi, Tamilnadu, India. Email: eswaranperumal '@' gmail.com
Load the Data
Overview of the Data
Data Preparation
Exploratory Data Analysis
Model Building
Improve Model
The notebook is designed so that you only need to plug in the input values given below and run the code. It will run on its own and build the model as well.
# Suppress warnings to keep the notebook output clean
from warnings import filterwarnings
filterwarnings('ignore')
# Input file name with path
input_file_name = 'kidney_disease.csv'
# Target class name
input_target_class = "class"
# Columns to be removed
input_drop_col = "id"
# Column datatype selection
input_datatype_selection = 'auto' # use 'auto' to infer column types automatically, or 'manual' to use the column lists below
# Categorical columns
# (note: 'ypertension' reproduces the misspelling in data_description.txt, so it matches the renamed columns)
input_cat_columns = [
'red blood cells', 'pus cell', 'pus cell clumps', 'bacteria',
'packed cell volume',
'white blood cell count', 'red blood cell count', 'ypertension',
'diabetes mellitus', 'coronary artery disease', 'appetite',
'pedal edema', 'anemia', 'class']
# Numerical columns
input_num_columns = [
'id', 'age', 'blood pressure', 'specific gravity', 'albumin', 'sugar',
'blood glucose random', 'blood urea', 'serum creatinine', 'sodium',
'potassium', 'haemoglobin']
# Encoding technique
input_encoding = 'LabelEncoder' # choose the encoding technique from 'LabelEncoder', 'OneHotEncoder', 'OrdinalEncoder' and 'FrequencyEncoder'
# Handle missing value
input_treat_missing_value = 'impute' # choose how to handle missing values from 'drop', 'impute' and 'ignore'
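The options above are plain strings, so a typo would silently fall through to an `else` branch deep inside the pipeline. A minimal, hypothetical sanity check (the name `validate_config` is an illustration, not part of the notebook) can fail fast instead:

```python
# Hypothetical sanity check for the configuration strings above:
# fail fast on a typo rather than deep inside the pipeline.
VALID_ENCODINGS = {'LabelEncoder', 'OneHotEncoder', 'OrdinalEncoder', 'FrequencyEncoder'}
VALID_MISSING = {'drop', 'impute', 'ignore'}
VALID_DTYPE_MODES = {'auto', 'manual'}

def validate_config(encoding, missing, dtype_mode):
    """Return True only if every option is one of the supported choices."""
    return (encoding in VALID_ENCODINGS
            and missing in VALID_MISSING
            and dtype_mode in VALID_DTYPE_MODES)
```

For example, `validate_config('LabelEncoder', 'inpute', 'auto')` returns `False`, catching the misspelled missing-value option immediately.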
In this section you will:
Import all the required libraries in the first cell itself
from pyforest import *
# Import libraries
# Data Manipulation
import numpy as np
import pandas as pd
from pandas import DataFrame
# Data Visualization
import seaborn as sns
import matplotlib.pyplot as plt
# Machine Learning
from sklearn.preprocessing import LabelEncoder, StandardScaler, OrdinalEncoder
from sklearn.impute import SimpleImputer
from sklearn.model_selection import train_test_split, GridSearchCV
# NOTE: plot_roc_curve and plot_confusion_matrix were removed in scikit-learn 1.2;
# on newer versions use RocCurveDisplay and ConfusionMatrixDisplay instead
from sklearn.metrics import confusion_matrix, classification_report, accuracy_score, roc_auc_score, plot_roc_curve, plot_confusion_matrix
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.naive_bayes import GaussianNB
from xgboost import XGBClassifier
from lightgbm import LGBMClassifier
from imblearn.over_sampling import RandomOverSampler
import pickle
from sklearn.feature_selection import SelectKBest # univariate feature selection (used here with the chi-squared score)
from sklearn.feature_selection import chi2
from sklearn.model_selection import RandomizedSearchCV
# Maths
import math
# Set the options
pd.set_option('display.max_rows', 800)
pd.set_option('display.max_columns', 500)
%matplotlib inline
Load the dataset using pd.read_csv()
# Read data in form of a csv file
df = pd.read_csv(input_file_name)
# First 5 rows of the dataset
df.head()
| id | age | bp | sg | al | su | rbc | pc | pcc | ba | bgr | bu | sc | sod | pot | hemo | pcv | wc | rc | htn | dm | cad | appet | pe | ane | classification | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 0 | 48.0 | 80.0 | 1.020 | 1.0 | 0.0 | NaN | normal | notpresent | notpresent | 121.0 | 36.0 | 1.2 | NaN | NaN | 15.4 | 44 | 7800 | 5.2 | yes | yes | no | good | no | no | ckd |
| 1 | 1 | 7.0 | 50.0 | 1.020 | 4.0 | 0.0 | NaN | normal | notpresent | notpresent | NaN | 18.0 | 0.8 | NaN | NaN | 11.3 | 38 | 6000 | NaN | no | no | no | good | no | no | ckd |
| 2 | 2 | 62.0 | 80.0 | 1.010 | 2.0 | 3.0 | normal | normal | notpresent | notpresent | 423.0 | 53.0 | 1.8 | NaN | NaN | 9.6 | 31 | 7500 | NaN | no | yes | no | poor | no | yes | ckd |
| 3 | 3 | 48.0 | 70.0 | 1.005 | 4.0 | 0.0 | normal | abnormal | present | notpresent | 117.0 | 56.0 | 3.8 | 111.0 | 2.5 | 11.2 | 32 | 6700 | 3.9 | yes | no | no | poor | yes | yes | ckd |
| 4 | 4 | 51.0 | 80.0 | 1.010 | 2.0 | 0.0 | normal | normal | notpresent | notpresent | 106.0 | 26.0 | 1.4 | NaN | NaN | 11.6 | 35 | 7300 | 4.6 | no | no | no | good | no | no | ckd |
Before attempting to solve the problem, it's very important to have a good understanding of the data.
In this section you will:
As the name suggests, descriptive statistics describe the data. They give you information about the count, mean, standard deviation, and range (min, quartiles, max) of each numeric column.
Let's understand the data we have
# Dimension of the data
df.shape
(400, 26)
# Summary of the dataset
df.describe().T
| count | mean | std | min | 25% | 50% | 75% | max | |
|---|---|---|---|---|---|---|---|---|
| id | 400.0 | 199.500000 | 115.614301 | 0.000 | 99.75 | 199.50 | 299.25 | 399.000 |
| age | 391.0 | 51.483376 | 17.169714 | 2.000 | 42.00 | 55.00 | 64.50 | 90.000 |
| bp | 388.0 | 76.469072 | 13.683637 | 50.000 | 70.00 | 80.00 | 80.00 | 180.000 |
| sg | 353.0 | 1.017408 | 0.005717 | 1.005 | 1.01 | 1.02 | 1.02 | 1.025 |
| al | 354.0 | 1.016949 | 1.352679 | 0.000 | 0.00 | 0.00 | 2.00 | 5.000 |
| su | 351.0 | 0.450142 | 1.099191 | 0.000 | 0.00 | 0.00 | 0.00 | 5.000 |
| bgr | 356.0 | 148.036517 | 79.281714 | 22.000 | 99.00 | 121.00 | 163.00 | 490.000 |
| bu | 381.0 | 57.425722 | 50.503006 | 1.500 | 27.00 | 42.00 | 66.00 | 391.000 |
| sc | 383.0 | 3.072454 | 5.741126 | 0.400 | 0.90 | 1.30 | 2.80 | 76.000 |
| sod | 313.0 | 137.528754 | 10.408752 | 4.500 | 135.00 | 138.00 | 142.00 | 163.000 |
| pot | 312.0 | 4.627244 | 3.193904 | 2.500 | 3.80 | 4.40 | 4.90 | 47.000 |
| hemo | 348.0 | 12.526437 | 2.912587 | 3.100 | 10.30 | 12.65 | 15.00 | 17.800 |
# Map abbreviated column names to their full names using the data description file
columns = pd.read_csv('data_description.txt', sep='-')
columns = columns.reset_index()
columns.columns = ['cols', 'abb_col_names']
columns
| cols | abb_col_names | |
|---|---|---|
| 0 | id | id |
| 1 | age | age |
| 2 | bp | blood pressure |
| 3 | sg | specific gravity |
| 4 | al | albumin |
| 5 | su | sugar |
| 6 | rbc | red blood cells |
| 7 | pc | pus cell |
| 8 | pcc | pus cell clumps |
| 9 | ba | bacteria |
| 10 | bgr | blood glucose random |
| 11 | bu | blood urea |
| 12 | sc | serum creatinine |
| 13 | sod | sodium |
| 14 | pot | potassium |
| 15 | hemo | haemoglobin |
| 16 | pcv | packed cell volume |
| 17 | wc | white blood cell count |
| 18 | rc | red blood cell count |
| 19 | htn | ypertension |
| 20 | dm | diabetes mellitus |
| 21 | cad | coronary artery disease |
| 22 | appet | appetite |
| 23 | pe | pedal edema |
| 24 | ane | anemia |
| 25 | classification | class |
df.columns = columns['abb_col_names'].values
df.head()
| id | age | blood pressure | specific gravity | albumin | sugar | red blood cells | pus cell | pus cell clumps | bacteria | blood glucose random | blood urea | serum creatinine | sodium | potassium | haemoglobin | packed cell volume | white blood cell count | red blood cell count | ypertension | diabetes mellitus | coronary artery disease | appetite | pedal edema | anemia | class | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 0 | 48.0 | 80.0 | 1.020 | 1.0 | 0.0 | NaN | normal | notpresent | notpresent | 121.0 | 36.0 | 1.2 | NaN | NaN | 15.4 | 44 | 7800 | 5.2 | yes | yes | no | good | no | no | ckd |
| 1 | 1 | 7.0 | 50.0 | 1.020 | 4.0 | 0.0 | NaN | normal | notpresent | notpresent | NaN | 18.0 | 0.8 | NaN | NaN | 11.3 | 38 | 6000 | NaN | no | no | no | good | no | no | ckd |
| 2 | 2 | 62.0 | 80.0 | 1.010 | 2.0 | 3.0 | normal | normal | notpresent | notpresent | 423.0 | 53.0 | 1.8 | NaN | NaN | 9.6 | 31 | 7500 | NaN | no | yes | no | poor | no | yes | ckd |
| 3 | 3 | 48.0 | 70.0 | 1.005 | 4.0 | 0.0 | normal | abnormal | present | notpresent | 117.0 | 56.0 | 3.8 | 111.0 | 2.5 | 11.2 | 32 | 6700 | 3.9 | yes | no | no | poor | yes | yes | ckd |
| 4 | 4 | 51.0 | 80.0 | 1.010 | 2.0 | 0.0 | normal | normal | notpresent | notpresent | 106.0 | 26.0 | 1.4 | NaN | NaN | 11.6 | 35 | 7300 | 4.6 | no | no | no | good | no | no | ckd |
df.dtypes
id int64 age float64 blood pressure float64 specific gravity float64 albumin float64 sugar float64 red blood cells object pus cell object pus cell clumps object bacteria object blood glucose random float64 blood urea float64 serum creatinine float64 sodium float64 potassium float64 haemoglobin float64 packed cell volume object white blood cell count object red blood cell count object ypertension object diabetes mellitus object coronary artery disease object appetite object pedal edema object anemia object class object dtype: object
df.columns
Index(['id', 'age', 'blood pressure', 'specific gravity', 'albumin', 'sugar',
'red blood cells', 'pus cell', 'pus cell clumps', 'bacteria',
'blood glucose random', 'blood urea', 'serum creatinine', 'sodium',
'potassium', 'haemoglobin', 'packed cell volume',
'white blood cell count', 'red blood cell count', 'ypertension',
'diabetes mellitus', 'coronary artery disease', 'appetite',
'pedal edema', 'anemia', 'class'],
dtype='object')
df.columns = df.columns.str.replace(' ', '_')
features = ['red_blood_cell_count', 'packed_cell_volume', 'white_blood_cell_count']

def convert_dtype(df, feature):
    # Coerce non-numeric strings to NaN
    df[feature] = pd.to_numeric(df[feature], errors='coerce')

for feature in features:
    convert_dtype(df, feature)
df.dtypes
id int64 age float64 blood_pressure float64 specific_gravity float64 albumin float64 sugar float64 red_blood_cells object pus_cell object pus_cell_clumps object bacteria object blood_glucose_random float64 blood_urea float64 serum_creatinine float64 sodium float64 potassium float64 haemoglobin float64 packed_cell_volume float64 white_blood_cell_count float64 red_blood_cell_count float64 ypertension object diabetes_mellitus object coronary_artery_disease object appetite object pedal_edema object anemia object class object dtype: object
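The conversion above relies on `pd.to_numeric` with `errors='coerce'`, which turns any string that cannot be parsed as a number into `NaN` instead of raising. A tiny standalone illustration (the sample values are made up):

```python
import pandas as pd

# Strings that cannot be parsed (here the stray '\t?') become NaN
s = pd.Series(['44', '38', '\t?', '31'])
converted = pd.to_numeric(s, errors='coerce')
print(converted.dtype)         # float64 (NaN forces a float dtype)
print(converted.isna().sum())  # 1
```

This is why the three converted columns show up as `float64` in `df.dtypes` above even though the raw values look like integers.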
The data is not yet ready for model building. You need to process it and make it ready for modelling.
In this section you will:
df.head()
| id | age | blood_pressure | specific_gravity | albumin | sugar | red_blood_cells | pus_cell | pus_cell_clumps | bacteria | blood_glucose_random | blood_urea | serum_creatinine | sodium | potassium | haemoglobin | packed_cell_volume | white_blood_cell_count | red_blood_cell_count | ypertension | diabetes_mellitus | coronary_artery_disease | appetite | pedal_edema | anemia | class | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 0 | 48.0 | 80.0 | 1.020 | 1.0 | 0.0 | NaN | normal | notpresent | notpresent | 121.0 | 36.0 | 1.2 | NaN | NaN | 15.4 | 44.0 | 7800.0 | 5.2 | yes | yes | no | good | no | no | ckd |
| 1 | 1 | 7.0 | 50.0 | 1.020 | 4.0 | 0.0 | NaN | normal | notpresent | notpresent | NaN | 18.0 | 0.8 | NaN | NaN | 11.3 | 38.0 | 6000.0 | NaN | no | no | no | good | no | no | ckd |
| 2 | 2 | 62.0 | 80.0 | 1.010 | 2.0 | 3.0 | normal | normal | notpresent | notpresent | 423.0 | 53.0 | 1.8 | NaN | NaN | 9.6 | 31.0 | 7500.0 | NaN | no | yes | no | poor | no | yes | ckd |
| 3 | 3 | 48.0 | 70.0 | 1.005 | 4.0 | 0.0 | normal | abnormal | present | notpresent | 117.0 | 56.0 | 3.8 | 111.0 | 2.5 | 11.2 | 32.0 | 6700.0 | 3.9 | yes | no | no | poor | yes | yes | ckd |
| 4 | 4 | 51.0 | 80.0 | 1.010 | 2.0 | 0.0 | normal | normal | notpresent | notpresent | 106.0 | 26.0 | 1.4 | NaN | NaN | 11.6 | 35.0 | 7300.0 | 4.6 | no | no | no | good | no | no | ckd |
It's better to get the lists of columns by data type right at the start. That way you won't have to write column names manually when performing later operations, so always build these lists up front.
# Remove extra columns
col_remove = input_drop_col
df = df.drop(col_remove, axis = 1)
# Get the list of numeric and categorical columns according to the input
if input_datatype_selection == "auto":
binary_columns = [col for col in df.columns if df[col].nunique() == 2]
print("Binary Columns : ", binary_columns)
categorical_columns = [col for col in df.columns if df[col].dtype == "object"]
print("Categorical Columns : ", categorical_columns)
categorical_columns = binary_columns + categorical_columns
categorical_columns = list(set(categorical_columns))
numerical_columns = [col for col in df.columns if col not in categorical_columns]
print("Numerical Columns : ", numerical_columns)
else:
categorical_columns = input_cat_columns
print("Categorical Columns : ", categorical_columns)
numerical_columns = input_num_columns
print("Numerical Columns : ", numerical_columns)
Binary Columns : ['red_blood_cells', 'pus_cell', 'pus_cell_clumps', 'bacteria', 'ypertension', 'appetite', 'pedal_edema', 'anemia'] Categorical Columns : ['red_blood_cells', 'pus_cell', 'pus_cell_clumps', 'bacteria', 'ypertension', 'diabetes_mellitus', 'coronary_artery_disease', 'appetite', 'pedal_edema', 'anemia', 'class'] Numerical Columns : ['age', 'blood_pressure', 'specific_gravity', 'albumin', 'sugar', 'blood_glucose_random', 'blood_urea', 'serum_creatinine', 'sodium', 'potassium', 'haemoglobin', 'packed_cell_volume', 'white_blood_cell_count', 'red_blood_cell_count']
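The detection rule above can be seen on a toy frame (an illustration with made-up columns, not the CKD data): a column with exactly two distinct values is treated as categorical even if it is numeric, and object columns are always categorical.

```python
import pandas as pd

# Toy frame: a 0/1 flag, a numeric column, and an object column with a NaN
toy = pd.DataFrame({
    'flag': [0, 1, 1, 0],
    'age':  [48, 7, 62, 51],
    'rbc':  ['normal', None, 'normal', 'abnormal'],
})

binary_cols = [c for c in toy.columns if toy[c].nunique() == 2]   # nunique ignores NaN
object_cols = [c for c in toy.columns if toy[c].dtype == 'object']
cat_cols = sorted(set(binary_cols + object_cols))
num_cols = [c for c in toy.columns if c not in cat_cols]
print(cat_cols)  # ['flag', 'rbc']
print(num_cols)  # ['age']
```

Note that `set` deduplicates columns that are both binary and object (like `rbc` here, where `nunique` ignores the `None`), which is exactly why the notebook unions the two lists before building the numerical list.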
# Inspect the unique categories of each categorical feature to check for dirty values
for col in categorical_columns:
print('{} has {} categories'.format(col, df[col].unique()))
bacteria has ['notpresent' 'present' nan] categories pus_cell_clumps has ['notpresent' 'present' nan] categories pedal_edema has ['no' 'yes' nan] categories class has ['ckd' 'ckd\t' 'notckd'] categories coronary_artery_disease has ['no' 'yes' '\tno' nan] categories pus_cell has ['normal' 'abnormal' nan] categories red_blood_cells has [nan 'normal' 'abnormal'] categories ypertension has ['yes' 'no' nan] categories anemia has ['no' 'yes' nan] categories diabetes_mellitus has ['yes' 'no' ' yes' '\tno' '\tyes' nan] categories appetite has ['good' 'poor' nan] categories
# Replace values corrupted by stray tabs and spaces
df['diabetes_mellitus'].replace(to_replace={'\tno': 'no', '\tyes': 'yes', ' yes': 'yes'}, inplace=True)
df['coronary_artery_disease'] = df['coronary_artery_disease'].replace(to_replace = '\tno', value='no')
df['class'] = df['class'].replace(to_replace = 'ckd\t', value = 'ckd')
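An alternative to listing each dirty value explicitly (an assumption on my part, not what the notebook does) is to strip surrounding whitespace and tabs from every object column in one pass, which catches values like `'ckd\t'` and `'\tno'` without enumerating them:

```python
import pandas as pd

# Tiny made-up frame with the same kind of tab-corrupted values
dirty = pd.DataFrame({
    'class': ['ckd', 'ckd\t', 'notckd'],
    'cad':   ['no', '\tno', 'yes'],
})

# Strip leading/trailing whitespace from every object (string) column
obj_cols = dirty.select_dtypes(include='object').columns
dirty[obj_cols] = dirty[obj_cols].apply(lambda s: s.str.strip())
print(dirty['class'].unique())  # ['ckd' 'notckd']
```

The explicit `replace` calls used in the notebook are safer when some whitespace is meaningful; the blanket strip is convenient when, as here, it clearly is not.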
df.head()
| age | blood_pressure | specific_gravity | albumin | sugar | red_blood_cells | pus_cell | pus_cell_clumps | bacteria | blood_glucose_random | blood_urea | serum_creatinine | sodium | potassium | haemoglobin | packed_cell_volume | white_blood_cell_count | red_blood_cell_count | ypertension | diabetes_mellitus | coronary_artery_disease | appetite | pedal_edema | anemia | class | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 48.0 | 80.0 | 1.020 | 1.0 | 0.0 | NaN | normal | notpresent | notpresent | 121.0 | 36.0 | 1.2 | NaN | NaN | 15.4 | 44.0 | 7800.0 | 5.2 | yes | yes | no | good | no | no | ckd |
| 1 | 7.0 | 50.0 | 1.020 | 4.0 | 0.0 | NaN | normal | notpresent | notpresent | NaN | 18.0 | 0.8 | NaN | NaN | 11.3 | 38.0 | 6000.0 | NaN | no | no | no | good | no | no | ckd |
| 2 | 62.0 | 80.0 | 1.010 | 2.0 | 3.0 | normal | normal | notpresent | notpresent | 423.0 | 53.0 | 1.8 | NaN | NaN | 9.6 | 31.0 | 7500.0 | NaN | no | yes | no | poor | no | yes | ckd |
| 3 | 48.0 | 70.0 | 1.005 | 4.0 | 0.0 | normal | abnormal | present | notpresent | 117.0 | 56.0 | 3.8 | 111.0 | 2.5 | 11.2 | 32.0 | 6700.0 | 3.9 | yes | no | no | poor | yes | yes | ckd |
| 4 | 51.0 | 80.0 | 1.010 | 2.0 | 0.0 | normal | normal | notpresent | notpresent | 106.0 | 26.0 | 1.4 | NaN | NaN | 11.6 | 35.0 | 7300.0 | 4.6 | no | no | no | good | no | no | ckd |
Get an overview of the missing values in the dataframe
# Missing values for every column
df.isna().sum()
age 9 blood_pressure 12 specific_gravity 47 albumin 46 sugar 49 red_blood_cells 152 pus_cell 65 pus_cell_clumps 4 bacteria 4 blood_glucose_random 44 blood_urea 19 serum_creatinine 17 sodium 87 potassium 88 haemoglobin 52 packed_cell_volume 71 white_blood_cell_count 106 red_blood_cell_count 131 ypertension 2 diabetes_mellitus 2 coronary_artery_disease 2 appetite 1 pedal_edema 1 anemia 1 class 0 dtype: int64
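Raw counts are harder to judge than proportions; a small sketch (my addition, using a few of the counts printed above) expresses missingness as a percentage of the 400 rows, which makes heavily missing columns like `red_blood_cells` (152/400 = 38%) stand out:

```python
import pandas as pd

# A few of the missing-value counts from df.isna().sum() above
counts = pd.Series({'red_blood_cells': 152, 'ypertension': 2, 'class': 0})
n_rows = 400

# Missingness as a percentage of rows, largest first
pct = (counts / n_rows * 100).round(1).sort_values(ascending=False)
print(pct)  # red_blood_cells 38.0, ypertension 0.5, class 0.0
```

In practice you would compute this directly as `(df.isna().mean() * 100).round(1)`; columns with very high percentages may be better dropped than imputed.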
Exploratory data analysis is an approach to analyzing or investigating data sets to discover patterns and assess whether any of the variables are useful for predicting the target variable. Visual methods are often used to summarise the data. Primarily, EDA is for seeing what the data can tell us beyond the formal modeling or hypothesis-testing tasks.
In this section you will:
Note: there might be a mismatch in the data types of some columns; in such cases you will have to correct them manually
You need to check the distribution of the target class: how many categories there are and whether they are balanced
# Check distribution of target class
sns.countplot(y=df[input_target_class], data=df)
plt.xlabel("Count of each Target class")
plt.ylabel("Target classes")
plt.show()
# Check the distribution of all the features
df.hist(figsize=(15,12),bins = 15)
plt.title("Features Distribution")
plt.show()
# box and whisker plots
df.plot(figsize=(15, 15), kind='box', subplots=True, layout=(8,8), sharex=False, sharey=False, fontsize=1)
plt.show()
# Heatmap of correlation between features
plt.figure(figsize = (20, 20))
sns.heatmap(df.drop(['class'], axis = 1).corr(), annot = True)
<AxesSubplot:>
Dive deeper into the correlations: some of them may help discriminate the target feature (class), as the visualizations below demonstrate.

Positive correlations:
- specific_gravity with red_blood_cell_count, packed_cell_volume, haemoglobin
- sugar with blood_glucose_random
- blood_urea with serum_creatinine
- haemoglobin with red_blood_cell_count and packed_cell_volume

Negative correlations:
- albumin and blood_urea with red_blood_cell_count, packed_cell_volume, haemoglobin
- serum_creatinine with sodium
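To quantify which numeric features track the target most strongly, one option (a sketch of my own, run here on a tiny made-up frame rather than the CKD data) is to 0/1-encode the class and rank features by absolute Pearson correlation with it:

```python
import pandas as pd

# Made-up miniature frame: low haemoglobin and high blood urea in the ckd rows
mini = pd.DataFrame({
    'haemoglobin': [9.6, 10.1, 8.5, 15.0, 14.8, 9.9],
    'blood_urea':  [90, 80, 100, 20, 25, 95],
    'target':      [1, 1, 1, 0, 0, 1],   # 1 = ckd, 0 = notckd
})

# Absolute correlation of each feature with the encoded target, strongest first
ranking = (mini.drop(columns='target')
               .corrwith(mini['target'])
               .abs()
               .sort_values(ascending=False))
print(ranking)
```

On the real data you would first encode `class` (e.g. with the LabelEncoder chosen in the inputs) and apply the same `corrwith` call to the numerical columns; the KDE plots defined next give the corresponding visual view per class.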
# Define a KDE plot of a feature, split by target class
def kde_plot(feature):
    grid = sns.FacetGrid(df, hue="class", aspect=2)
    grid.map(sns.kdeplot, feature)
    grid.add_legend()
pos_features = df[['specific_gravity', 'red_blood_cell_count', 'packed_cell_volume', 'haemoglobin', 'sugar', 'blood_glucose_random',
'blood_urea', 'serum_creatinine']]
pos_features
| specific_gravity | red_blood_cell_count | packed_cell_volume | haemoglobin | sugar | blood_glucose_random | blood_urea | serum_creatinine | |
|---|---|---|---|---|---|---|---|---|
| 0 | 1.020 | 5.2 | 44.0 | 15.4 | 0.0 | 121.0 | 36.0 | 1.20 |
| 1 | 1.020 | NaN | 38.0 | 11.3 | 0.0 | NaN | 18.0 | 0.80 |
| 2 | 1.010 | NaN | 31.0 | 9.6 | 3.0 | 423.0 | 53.0 | 1.80 |
| 3 | 1.005 | 3.9 | 32.0 | 11.2 | 0.0 | 117.0 | 56.0 | 3.80 |
| 4 | 1.010 | 4.6 | 35.0 | 11.6 | 0.0 | 106.0 | 26.0 | 1.40 |
| 5 | 1.015 | 4.4 | 39.0 | 12.2 | 0.0 | 74.0 | 25.0 | 1.10 |
| 6 | 1.010 | NaN | 36.0 | 12.4 | 0.0 | 100.0 | 54.0 | 24.00 |
| 7 | 1.015 | 5.0 | 44.0 | 12.4 | 4.0 | 410.0 | 31.0 | 1.10 |
| 8 | 1.015 | 4.0 | 33.0 | 10.8 | 0.0 | 138.0 | 60.0 | 1.90 |
| 9 | 1.020 | 3.7 | 29.0 | 9.5 | 0.0 | 70.0 | 107.0 | 7.20 |
| 10 | 1.010 | NaN | 28.0 | 9.4 | 4.0 | 490.0 | 55.0 | 4.00 |
| 11 | 1.010 | 3.8 | 32.0 | 10.8 | 0.0 | 380.0 | 60.0 | 2.70 |
| 12 | 1.015 | 3.4 | 28.0 | 9.7 | 1.0 | 208.0 | 72.0 | 2.10 |
| 13 | NaN | NaN | NaN | 9.8 | NaN | 98.0 | 86.0 | 4.60 |
| 14 | 1.010 | 2.6 | 16.0 | 5.6 | 2.0 | 157.0 | 90.0 | 4.10 |
| 15 | 1.015 | 2.8 | 24.0 | 7.6 | 0.0 | 76.0 | 162.0 | 9.60 |
| 16 | 1.015 | NaN | NaN | 12.6 | 0.0 | 99.0 | 46.0 | 2.20 |
| 17 | NaN | NaN | NaN | 12.1 | NaN | 114.0 | 87.0 | 5.20 |
| 18 | 1.025 | 4.3 | 37.0 | 12.7 | 3.0 | 263.0 | 27.0 | 1.30 |
| 19 | 1.015 | 3.7 | 30.0 | 10.3 | 0.0 | 100.0 | 31.0 | 1.60 |
| 20 | 1.015 | 3.2 | 24.0 | 7.7 | 0.0 | 173.0 | 148.0 | 3.90 |
| 21 | NaN | 3.6 | 32.0 | 10.9 | NaN | NaN | 180.0 | 76.00 |
| 22 | 1.025 | 3.4 | 32.0 | 9.8 | 0.0 | 95.0 | 163.0 | 7.70 |
| 23 | 1.010 | NaN | NaN | NaN | 0.0 | NaN | NaN | NaN |
| 24 | 1.015 | 4.6 | 39.0 | 11.1 | 0.0 | NaN | 50.0 | 1.40 |
| 25 | 1.025 | 3.7 | 29.0 | 9.9 | 0.0 | 108.0 | 75.0 | 1.90 |
| 26 | 1.015 | 4.0 | 35.0 | 11.6 | 0.0 | 156.0 | 45.0 | 2.40 |
| 27 | 1.010 | 4.1 | 37.0 | 12.5 | 4.0 | 264.0 | 87.0 | 2.70 |
| 28 | NaN | NaN | NaN | NaN | 3.0 | 123.0 | 31.0 | 1.40 |
| 29 | 1.005 | NaN | 38.0 | 12.9 | 0.0 | NaN | 28.0 | 1.40 |
| 30 | NaN | NaN | NaN | NaN | NaN | 93.0 | 155.0 | 7.30 |
| 31 | 1.015 | 4.0 | 30.0 | 10.1 | 0.0 | 107.0 | 33.0 | 1.50 |
| 32 | 1.010 | 4.0 | 34.0 | 11.3 | 1.0 | 159.0 | 39.0 | 1.50 |
| 33 | 1.020 | NaN | 29.0 | 10.1 | 0.0 | 140.0 | 55.0 | 2.50 |
| 34 | 1.010 | NaN | NaN | NaN | 0.0 | 171.0 | 153.0 | 5.20 |
| 35 | 1.020 | 4.9 | 36.0 | 12.0 | 1.0 | 270.0 | 39.0 | 2.00 |
| 36 | 1.015 | NaN | 32.0 | 10.3 | 0.0 | 92.0 | 29.0 | 1.80 |
| 37 | NaN | 2.5 | 28.0 | 9.7 | NaN | 137.0 | 65.0 | 3.40 |
| 38 | 1.020 | NaN | NaN | 12.5 | 0.0 | NaN | 103.0 | 4.10 |
| 39 | 1.010 | 4.2 | 40.0 | 13.0 | 2.0 | 140.0 | 70.0 | 3.40 |
| 40 | 1.010 | 4.1 | 32.0 | 11.1 | 0.0 | 99.0 | 80.0 | 2.10 |
| 41 | 1.010 | NaN | NaN | NaN | 0.0 | NaN | 20.0 | 0.70 |
| 42 | 1.010 | 4.5 | 33.0 | 9.7 | 0.0 | 204.0 | 29.0 | 1.00 |
| 43 | 1.010 | 3.1 | 24.0 | 7.9 | 0.0 | 79.0 | 202.0 | 10.80 |
| 44 | 1.010 | NaN | 28.0 | 9.7 | 0.0 | 207.0 | 77.0 | 6.30 |
| 45 | 1.020 | NaN | NaN | 9.3 | 0.0 | 208.0 | 89.0 | 5.90 |
| 46 | 1.015 | 4.7 | 37.0 | 12.4 | 0.0 | 124.0 | 24.0 | 1.20 |
| 47 | 1.010 | NaN | 45.0 | 15.0 | 0.0 | NaN | 17.0 | 0.80 |
| 48 | 1.005 | 3.5 | 29.0 | 10.0 | 0.0 | 70.0 | 32.0 | 0.90 |
| 49 | 1.010 | 3.5 | 29.0 | 9.7 | 0.0 | 144.0 | 72.0 | 3.00 |
| 50 | NaN | 3.8 | 28.0 | 8.6 | NaN | 91.0 | 114.0 | 3.25 |
| 51 | 1.015 | NaN | 33.0 | 10.3 | 0.0 | 162.0 | 66.0 | 1.60 |
| 52 | 1.015 | 3.7 | 34.0 | 10.9 | 0.0 | NaN | 38.0 | 2.20 |
| 53 | 1.015 | 4.7 | 40.0 | 13.6 | 5.0 | 246.0 | 24.0 | 1.00 |
| 54 | 1.010 | 4.2 | 40.0 | 13.0 | 2.0 | NaN | NaN | 3.40 |
| 55 | 1.005 | NaN | 28.0 | 9.5 | 0.0 | NaN | NaN | NaN |
| 56 | 1.015 | 3.4 | 30.0 | 10.2 | 4.0 | NaN | 164.0 | 9.70 |
| 57 | NaN | NaN | NaN | NaN | NaN | 93.0 | 155.0 | 7.30 |
| 58 | 1.020 | 4.3 | 33.0 | 10.5 | 0.0 | 253.0 | 142.0 | 4.60 |
| 59 | NaN | NaN | NaN | 6.6 | NaN | NaN | 96.0 | 6.40 |
| 60 | 1.020 | NaN | NaN | NaN | 0.0 | 141.0 | 66.0 | 3.20 |
| 61 | 1.010 | NaN | NaN | NaN | 3.0 | 182.0 | 391.0 | 32.00 |
| 62 | 1.020 | 3.8 | 33.0 | 11.0 | 0.0 | 86.0 | 15.0 | 0.60 |
| 63 | 1.015 | NaN | 27.0 | 7.5 | 0.0 | 150.0 | 111.0 | 6.10 |
| 64 | 1.010 | NaN | NaN | 9.8 | 0.0 | 146.0 | NaN | NaN |
| 65 | 1.010 | NaN | 48.0 | 15.0 | 0.0 | NaN | 20.0 | 1.10 |
| 66 | 1.020 | NaN | NaN | NaN | 0.0 | 150.0 | 55.0 | 1.60 |
| 67 | 1.020 | NaN | NaN | NaN | 0.0 | 425.0 | NaN | NaN |
| 68 | 1.010 | NaN | 37.0 | 10.9 | 0.0 | 112.0 | 73.0 | 3.30 |
| 69 | 1.015 | 6.0 | 52.0 | 15.6 | 4.0 | 250.0 | 20.0 | 1.10 |
| 70 | 1.015 | 5.2 | 44.0 | 15.2 | 4.0 | 360.0 | 19.0 | 0.70 |
| 71 | 1.010 | 3.2 | 28.0 | 9.8 | 0.0 | 163.0 | 92.0 | 3.30 |
| 72 | 1.010 | NaN | NaN | 10.3 | 3.0 | NaN | 35.0 | 1.30 |
| 73 | 1.015 | NaN | 14.0 | 4.8 | 0.0 | 129.0 | 107.0 | 6.70 |
| 74 | 1.015 | 3.4 | 29.0 | 9.1 | 0.0 | 129.0 | 107.0 | 6.70 |
| 75 | 1.015 | NaN | NaN | 8.1 | 0.0 | NaN | 16.0 | 0.70 |
| 76 | 1.005 | 4.0 | 36.0 | 10.3 | 0.0 | 133.0 | 139.0 | 8.50 |
| 77 | 1.010 | 3.7 | 34.0 | 11.9 | 0.0 | 102.0 | 48.0 | 3.20 |
| 78 | NaN | NaN | 30.0 | 10.1 | NaN | 158.0 | 85.0 | 3.20 |
| 79 | 1.010 | 5.0 | 40.0 | 13.5 | 0.0 | 165.0 | 55.0 | 1.80 |
| 80 | 1.010 | 3.8 | 31.0 | 10.8 | 0.0 | 132.0 | 98.0 | 2.80 |
| 81 | NaN | 3.7 | 29.0 | 8.3 | NaN | 360.0 | 45.0 | 2.40 |
| 82 | NaN | NaN | NaN | NaN | NaN | 104.0 | 77.0 | 1.90 |
| 83 | 1.015 | NaN | NaN | NaN | 0.0 | 127.0 | 19.0 | 1.00 |
| 84 | 1.010 | 2.1 | 22.0 | 7.1 | 0.0 | 76.0 | 186.0 | 15.00 |
| 85 | 1.015 | NaN | NaN | 9.9 | NaN | NaN | 46.0 | 1.50 |
| 86 | NaN | NaN | NaN | NaN | NaN | 415.0 | 37.0 | 1.90 |
| 87 | 1.005 | 5.0 | 32.0 | 11.1 | 0.0 | 169.0 | 47.0 | 2.90 |
| 88 | 1.010 | 4.7 | NaN | NaN | 0.0 | 251.0 | 52.0 | 2.20 |
| 89 | 1.020 | NaN | NaN | NaN | 0.0 | 109.0 | 32.0 | 1.40 |
| 90 | 1.010 | 4.2 | 40.0 | 13.0 | 2.0 | 280.0 | 35.0 | 3.20 |
| 91 | 1.015 | 5.6 | 52.0 | 16.1 | 1.0 | 210.0 | 26.0 | 1.70 |
| 92 | 1.010 | 3.6 | 33.0 | 10.4 | 0.0 | 219.0 | 82.0 | 3.60 |
| 93 | 1.010 | 3.2 | 30.0 | 9.2 | 2.0 | 295.0 | 90.0 | 5.60 |
| 94 | 1.010 | 3.9 | 36.0 | 11.6 | 0.0 | 93.0 | 66.0 | 1.60 |
| 95 | 1.015 | NaN | NaN | NaN | 0.0 | 94.0 | 25.0 | 1.10 |
| 96 | 1.010 | NaN | 36.0 | 11.2 | 1.0 | 172.0 | 32.0 | 2.70 |
| 97 | 1.015 | 4.0 | 32.0 | 10.0 | 0.0 | 91.0 | 51.0 | 2.20 |
| 98 | NaN | 2.3 | 18.0 | 6.2 | NaN | 101.0 | 106.0 | 6.50 |
| 99 | NaN | 4.2 | 32.0 | 11.2 | 4.0 | 298.0 | 24.0 | 1.20 |
| 100 | 1.015 | NaN | NaN | NaN | 0.0 | 153.0 | 22.0 | 0.90 |
| 101 | 1.015 | 3.9 | 33.0 | 11.3 | 0.0 | 88.0 | 80.0 | 4.40 |
| 102 | 1.010 | NaN | 52.0 | 13.9 | 0.0 | 92.0 | 32.0 | 2.10 |
| 103 | 1.015 | 4.2 | 36.0 | 10.2 | 0.0 | 226.0 | 217.0 | 10.20 |
| 104 | NaN | NaN | NaN | NaN | NaN | 143.0 | 88.0 | 2.00 |
| 105 | 1.015 | 5.2 | 42.0 | 14.1 | 0.0 | 115.0 | 32.0 | 11.50 |
| 106 | NaN | NaN | 17.0 | 6.0 | NaN | 89.0 | 118.0 | 6.10 |
| 107 | 1.015 | 4.4 | 34.0 | 11.2 | 4.0 | 297.0 | 53.0 | 2.80 |
| 108 | 1.015 | 4.2 | 37.0 | 11.8 | 0.0 | 107.0 | 15.0 | 1.00 |
| 109 | NaN | NaN | NaN | 11.7 | NaN | 233.0 | 50.1 | 1.90 |
| 110 | 1.015 | 4.7 | 34.0 | 11.7 | 0.0 | 123.0 | 19.0 | 2.00 |
| 111 | 1.010 | 3.9 | 32.0 | 10.0 | 3.0 | 294.0 | 71.0 | 4.40 |
| 112 | 1.015 | NaN | 33.0 | 10.8 | 0.0 | NaN | 34.0 | 1.20 |
| 113 | 1.015 | NaN | NaN | NaN | 2.0 | NaN | NaN | NaN |
| 114 | 1.015 | NaN | NaN | 12.1 | 0.0 | NaN | 51.0 | 1.80 |
| 115 | 1.010 | 4.3 | 44.0 | 12.4 | 0.0 | NaN | 28.0 | 0.90 |
| 116 | 1.015 | NaN | NaN | NaN | 0.0 | 104.0 | 16.0 | 0.50 |
| 117 | 1.020 | 4.4 | 37.0 | 12.5 | 0.0 | 219.0 | 36.0 | 1.30 |
| 118 | 1.010 | NaN | NaN | 11.4 | 0.0 | 99.0 | 25.0 | 1.20 |
| 119 | 1.010 | NaN | NaN | NaN | 0.0 | 140.0 | 27.0 | 1.20 |
| 120 | 1.025 | NaN | NaN | 12.6 | 3.0 | 323.0 | 40.0 | 2.20 |
| 121 | NaN | NaN | 46.0 | 15.0 | NaN | 125.0 | 21.0 | 1.30 |
| 122 | NaN | NaN | NaN | 6.0 | NaN | NaN | 219.0 | 12.20 |
| 123 | 1.015 | NaN | 42.0 | 14.0 | 3.0 | NaN | 30.0 | 1.10 |
| 124 | 1.015 | 3.6 | 28.0 | 9.1 | 0.0 | 90.0 | 98.0 | 2.50 |
| 125 | NaN | NaN | NaN | NaN | NaN | 308.0 | 36.0 | 2.50 |
| 126 | 1.015 | 4.5 | 37.0 | 12.0 | 0.0 | 144.0 | 125.0 | 4.00 |
| 127 | 1.015 | 4.3 | 35.0 | 11.4 | 0.0 | 118.0 | 125.0 | 5.30 |
| 128 | 1.015 | 2.9 | 23.0 | 8.1 | 3.0 | 224.0 | 166.0 | 5.60 |
| 129 | 1.025 | NaN | NaN | 11.1 | 0.0 | 158.0 | 49.0 | 1.40 |
| 130 | 1.010 | 2.7 | 22.0 | 8.2 | 0.0 | 128.0 | 208.0 | 9.20 |
| 131 | 1.010 | NaN | 36.0 | 11.8 | 0.0 | NaN | 25.0 | 0.60 |
| 132 | NaN | 2.7 | 24.0 | 8.6 | NaN | 219.0 | 176.0 | 13.80 |
| 133 | 1.015 | 8.0 | 37.0 | 12.0 | 0.0 | 118.0 | 125.0 | 5.30 |
| 134 | 1.010 | 3.8 | 33.0 | 10.8 | NaN | 122.0 | NaN | 16.90 |
| 135 | 1.015 | NaN | 39.0 | 13.2 | 2.0 | 214.0 | 24.0 | 1.30 |
| 136 | 1.020 | NaN | NaN | 9.3 | NaN | 213.0 | 68.0 | 2.80 |
| 137 | 1.010 | NaN | 29.0 | 10.0 | 0.0 | 268.0 | 86.0 | 4.00 |
| 138 | 1.010 | NaN | NaN | NaN | 0.0 | 95.0 | 51.0 | 1.60 |
| 139 | 1.015 | NaN | 33.0 | 11.1 | 0.0 | NaN | 68.0 | 2.80 |
| 140 | 1.010 | NaN | NaN | NaN | 4.0 | 256.0 | 40.0 | 1.20 |
| 141 | 1.010 | NaN | 19.0 | 6.1 | 0.0 | NaN | 106.0 | 6.00 |
| 142 | NaN | NaN | NaN | NaN | NaN | 84.0 | 145.0 | 7.10 |
| 143 | 1.015 | NaN | NaN | NaN | 4.0 | 210.0 | 165.0 | 18.00 |
| 144 | 1.010 | 4.1 | 33.0 | 11.1 | 0.0 | 105.0 | 53.0 | 2.30 |
| 145 | 1.015 | 3.3 | 24.0 | 8.0 | 0.0 | NaN | 322.0 | 13.00 |
| 146 | 1.010 | NaN | NaN | NaN | 3.0 | 213.0 | 23.0 | 1.00 |
| 147 | 1.010 | 3.0 | 25.0 | 7.9 | 1.0 | 288.0 | 36.0 | 1.70 |
| 148 | NaN | NaN | NaN | NaN | NaN | 171.0 | 26.0 | 48.10 |
| 149 | 1.020 | NaN | 32.0 | 10.5 | 0.0 | 139.0 | 29.0 | 1.00 |
| 150 | 1.025 | NaN | 41.0 | 12.3 | 0.0 | 78.0 | 27.0 | 0.90 |
| 151 | NaN | NaN | 30.0 | 9.6 | NaN | 172.0 | 46.0 | 1.70 |
| 152 | 1.010 | NaN | 32.0 | 10.9 | 0.0 | 121.0 | 20.0 | 0.80 |
| 153 | 1.010 | 2.9 | 22.0 | 8.3 | 1.0 | 273.0 | 235.0 | 14.20 |
| 154 | 1.005 | 3.0 | 26.0 | 8.4 | 3.0 | 242.0 | 132.0 | 16.40 |
| 155 | 1.020 | NaN | 36.0 | 11.1 | 0.0 | 123.0 | 40.0 | 1.80 |
| 156 | 1.015 | NaN | NaN | NaN | 0.0 | 153.0 | 76.0 | 3.30 |
| 157 | 1.025 | 3.9 | 39.0 | 12.6 | 0.0 | 122.0 | 42.0 | 1.70 |
| 158 | 1.020 | NaN | 31.0 | 10.9 | 2.0 | 424.0 | 48.0 | 1.50 |
| 159 | 1.010 | 4.3 | 35.0 | 10.4 | 0.0 | 303.0 | 35.0 | 1.30 |
| 160 | NaN | 2.4 | 35.0 | 10.9 | NaN | 148.0 | 39.0 | 2.10 |
| 161 | 1.015 | 4.8 | 42.0 | 14.3 | 0.0 | NaN | NaN | NaN |
| 162 | NaN | NaN | 37.0 | 9.8 | NaN | 204.0 | 34.0 | 1.50 |
| 163 | 1.010 | 3.2 | 27.0 | 9.0 | 0.0 | 160.0 | 40.0 | 2.00 |
| 164 | 1.015 | 5.4 | 40.0 | 14.3 | 0.0 | 192.0 | 15.0 | 0.80 |
| 165 | 1.020 | NaN | NaN | NaN | 2.0 | NaN | NaN | NaN |
| 166 | NaN | NaN | NaN | NaN | NaN | 76.0 | 44.0 | 3.90 |
| 167 | 1.020 | NaN | 42.0 | 12.7 | 0.0 | 139.0 | 19.0 | 0.90 |
| 168 | 1.015 | NaN | 39.0 | 11.0 | 4.0 | 307.0 | 28.0 | 1.50 |
| 169 | 1.010 | NaN | 27.0 | 8.7 | 2.0 | 220.0 | 68.0 | 2.80 |
| 170 | 1.015 | 4.4 | 33.0 | 12.5 | 5.0 | 447.0 | 41.0 | 1.70 |
| 171 | 1.020 | 3.1 | 26.0 | 8.7 | 0.0 | 102.0 | 60.0 | 2.60 |
| 172 | 1.010 | 4.9 | 34.0 | 10.6 | 2.0 | 309.0 | 113.0 | 2.90 |
| 173 | 1.015 | NaN | 41.0 | 13.1 | 0.0 | 22.0 | 1.5 | 7.30 |
| 174 | NaN | 4.6 | 35.0 | 11.0 | NaN | 111.0 | 146.0 | 7.50 |
| 175 | 1.010 | 3.4 | NaN | NaN | 0.0 | 261.0 | 58.0 | 2.20 |
| 176 | 1.010 | 3.9 | 23.0 | 8.3 | 0.0 | 107.0 | 40.0 | 1.70 |
| 177 | 1.015 | NaN | 41.0 | 13.2 | 1.0 | 215.0 | 133.0 | 2.50 |
| 178 | 1.020 | NaN | 34.0 | 9.8 | 0.0 | 93.0 | 153.0 | 2.70 |
| 179 | 1.010 | NaN | 39.0 | 11.9 | 0.0 | 124.0 | 53.0 | 2.30 |
| 180 | 1.010 | NaN | 28.0 | 10.3 | 4.0 | 234.0 | 56.0 | 1.90 |
| 181 | 1.025 | 3.7 | 30.0 | 10.0 | 0.0 | 117.0 | 52.0 | 2.20 |
| 182 | 1.020 | NaN | 35.0 | 11.3 | 0.0 | 131.0 | 23.0 | 0.80 |
| 183 | 1.015 | NaN | NaN | NaN | 0.0 | 101.0 | 106.0 | 6.50 |
| 184 | 1.015 | 3.6 | 31.0 | 11.3 | 2.0 | 352.0 | 137.0 | 3.30 |
| 185 | 1.020 | NaN | 34.0 | 12.0 | 0.0 | 99.0 | 23.0 | 0.60 |
| 186 | 1.020 | NaN | NaN | NaN | 0.0 | NaN | 46.0 | 1.00 |
| 187 | 1.010 | NaN | 34.0 | 10.7 | 0.0 | NaN | 22.0 | 0.70 |
| 188 | NaN | NaN | 38.0 | 12.2 | NaN | 80.0 | 66.0 | 2.50 |
| 189 | 1.010 | 3.4 | 29.0 | 9.5 | 1.0 | 239.0 | 58.0 | 4.30 |
| 190 | 1.010 | 4.8 | 30.0 | 9.9 | 0.0 | 94.0 | 67.0 | 1.00 |
| 191 | 1.010 | 3.4 | 26.0 | 9.1 | 0.0 | 110.0 | 115.0 | 6.00 |
| 192 | 1.015 | NaN | NaN | NaN | 0.0 | 130.0 | 16.0 | 0.90 |
| 193 | 1.025 | 2.8 | 15.0 | 5.5 | 0.0 | NaN | 223.0 | 18.10 |
| 194 | 1.010 | NaN | NaN | NaN | NaN | NaN | 49.0 | 1.20 |
| 195 | 1.020 | NaN | NaN | 5.8 | 1.0 | 184.0 | 98.6 | 3.30 |
| 196 | 1.010 | 3.5 | 24.0 | 8.1 | 0.0 | 129.0 | 158.0 | 11.80 |
| 197 | NaN | 3.0 | NaN | 6.8 | NaN | NaN | 111.0 | 9.30 |
| 198 | 1.020 | 3.9 | 30.0 | 11.2 | 2.0 | 252.0 | 40.0 | 3.20 |
| 199 | 1.015 | 3.2 | 25.0 | 8.8 | 0.0 | 92.0 | 37.0 | 1.50 |
| 200 | 1.025 | 3.9 | 37.0 | 12.0 | 0.0 | 139.0 | 89.0 | 3.00 |
| 201 | NaN | NaN | 21.0 | 7.9 | NaN | 113.0 | 94.0 | 7.30 |
| 202 | NaN | NaN | 24.0 | 8.0 | NaN | 114.0 | 74.0 | 2.90 |
| 203 | NaN | NaN | NaN | 8.5 | NaN | 207.0 | 80.0 | 6.80 |
| 204 | 1.010 | NaN | 31.0 | 8.8 | 2.0 | 172.0 | 82.0 | 13.50 |
| 205 | NaN | NaN | 43.0 | 12.6 | NaN | 100.0 | 28.0 | 2.10 |
| 206 | 1.010 | NaN | 41.0 | 13.8 | 0.0 | 109.0 | 96.0 | 3.90 |
| 207 | 1.010 | 4.6 | 41.0 | 12.0 | 0.0 | 230.0 | 50.0 | 2.20 |
| 208 | NaN | 4.9 | 41.0 | 12.3 | NaN | 341.0 | 37.0 | 1.50 |
| 209 | 1.020 | NaN | NaN | 11.5 | 0.0 | NaN | NaN | NaN |
| 210 | 1.015 | 3.9 | 20.0 | 7.3 | 2.0 | 255.0 | 132.0 | 12.80 |
| 211 | 1.015 | NaN | NaN | NaN | 0.0 | 103.0 | 18.0 | 1.20 |
| 212 | 1.015 | 3.4 | 31.0 | 10.9 | 4.0 | 253.0 | 150.0 | 11.90 |
| 213 | 1.010 | 3.7 | 34.0 | 10.9 | 1.0 | 214.0 | 73.0 | 3.90 |
| 214 | 1.015 | 5.2 | 43.0 | 13.7 | 0.0 | 171.0 | 30.0 | 1.00 |
| 215 | 1.010 | NaN | NaN | NaN | 0.0 | NaN | NaN | NaN |
| 216 | 1.010 | NaN | 38.0 | 12.8 | 0.0 | 107.0 | 15.0 | NaN |
| 217 | 1.010 | 4.3 | 36.0 | 12.2 | 0.0 | 78.0 | 61.0 | 1.80 |
| 218 | 1.015 | NaN | 34.0 | 11.8 | 0.0 | 92.0 | 19.0 | 0.80 |
| 219 | 1.010 | 3.3 | 28.0 | 9.8 | 0.0 | 238.0 | 57.0 | 2.50 |
| 220 | 1.010 | NaN | 36.0 | 11.9 | 0.0 | 103.0 | NaN | NaN |
| 221 | 1.020 | NaN | NaN | NaN | 0.0 | 248.0 | 30.0 | 1.70 |
| 222 | NaN | NaN | NaN | NaN | NaN | 108.0 | 68.0 | 1.80 |
| 223 | 1.010 | 4.6 | 38.0 | 13.0 | 3.0 | 303.0 | 30.0 | 1.30 |
| 224 | 1.020 | NaN | NaN | NaN | 0.0 | 117.0 | 28.0 | 2.20 |
| 225 | 1.010 | 4.5 | 35.0 | 11.5 | 5.0 | 490.0 | 95.0 | 2.70 |
| 226 | 1.015 | 3.4 | 26.0 | 7.9 | 2.0 | 163.0 | 54.0 | 7.20 |
| 227 | 1.015 | 3.8 | 36.0 | 11.3 | 0.0 | 120.0 | 48.0 | 1.60 |
| 228 | NaN | NaN | NaN | NaN | NaN | 124.0 | 52.0 | 2.50 |
| 229 | 1.010 | 3.8 | 31.0 | 9.6 | 0.0 | 241.0 | 191.0 | 12.00 |
| 230 | 1.010 | NaN | NaN | NaN | 0.0 | 192.0 | 17.0 | 1.70 |
| 231 | NaN | NaN | 35.0 | 11.5 | NaN | 269.0 | 51.0 | 2.80 |
| 232 | 1.015 | NaN | NaN | NaN | 0.0 | NaN | NaN | NaN |
| 233 | 1.015 | NaN | NaN | NaN | 0.0 | 93.0 | 20.0 | 1.60 |
| 234 | 1.010 | 5.2 | 44.0 | 15.0 | 0.0 | NaN | 19.0 | 1.30 |
| 235 | 1.010 | NaN | 26.0 | 7.9 | 0.0 | 113.0 | 93.0 | 2.30 |
| 236 | NaN | NaN | 25.0 | 9.1 | NaN | 74.0 | 66.0 | 2.00 |
| 237 | 1.015 | NaN | 40.0 | 12.7 | 2.0 | 141.0 | 53.0 | 2.20 |
| 238 | NaN | NaN | 28.0 | 9.4 | NaN | 201.0 | 241.0 | 13.40 |
| 239 | 1.015 | NaN | 39.0 | 11.9 | 0.0 | 104.0 | 50.0 | 1.60 |
| 240 | 1.015 | 4.1 | 36.0 | 11.4 | 0.0 | 203.0 | 46.0 | 1.40 |
| 241 | 1.015 | 3.9 | 31.0 | 10.4 | 0.0 | 165.0 | 45.0 | 1.50 |
| 242 | 1.010 | 3.3 | 28.0 | 9.4 | 3.0 | 214.0 | 96.0 | 6.30 |
| 243 | 1.020 | 6.1 | 47.0 | 13.4 | 1.0 | 169.0 | 48.0 | 2.40 |
| 244 | 1.015 | 4.6 | 40.0 | 12.2 | 2.0 | 463.0 | 64.0 | 2.80 |
| 245 | NaN | 2.6 | 19.0 | 6.3 | NaN | 103.0 | 79.0 | 5.30 |
| 246 | 1.015 | 2.5 | 26.0 | 8.6 | 0.0 | 106.0 | 215.0 | 15.20 |
| 247 | 1.025 | NaN | NaN | NaN | 0.0 | 150.0 | 18.0 | 1.20 |
| 248 | 1.010 | 4.1 | 37.0 | 12.6 | 3.0 | 424.0 | 55.0 | 1.70 |
| 249 | 1.010 | 2.1 | 9.0 | 3.1 | 1.0 | 176.0 | 309.0 | 13.30 |
| 250 | 1.025 | 4.5 | 48.0 | 15.0 | 0.0 | 140.0 | 10.0 | 1.20 |
| 251 | 1.025 | 5.0 | 52.0 | 17.0 | 0.0 | 70.0 | 36.0 | 1.00 |
| 252 | 1.025 | 4.7 | 46.0 | 15.9 | 0.0 | 82.0 | 49.0 | 0.60 |
| 253 | 1.025 | 6.2 | 42.0 | 15.4 | 0.0 | 119.0 | 17.0 | 1.20 |
| 254 | 1.025 | 5.2 | 49.0 | 13.0 | 0.0 | 99.0 | 38.0 | 0.80 |
| 255 | 1.025 | 6.3 | 52.0 | 13.6 | 0.0 | 121.0 | 27.0 | 1.20 |
| 256 | 1.025 | 5.1 | 41.0 | 14.5 | 0.0 | 131.0 | 10.0 | 0.50 |
| 257 | 1.020 | 5.8 | 46.0 | 14.0 | 0.0 | 91.0 | 36.0 | 0.70 |
| 258 | 1.020 | 5.5 | 44.0 | 13.9 | 0.0 | 98.0 | 20.0 | 0.50 |
| 259 | 1.020 | 5.2 | 45.0 | 16.1 | 0.0 | 104.0 | 31.0 | 1.20 |
| 260 | 1.020 | 5.3 | 45.0 | 14.1 | 0.0 | 131.0 | 38.0 | 1.00 |
| 261 | 1.020 | 4.9 | 41.0 | 17.0 | 0.0 | 122.0 | 32.0 | 1.20 |
| 262 | 1.020 | 5.4 | 43.0 | 15.5 | 0.0 | 118.0 | 18.0 | 0.90 |
| 263 | 1.020 | 5.2 | 45.0 | 16.2 | 0.0 | 117.0 | 46.0 | 1.20 |
| 264 | 1.020 | 4.5 | 50.0 | 14.4 | 0.0 | 132.0 | 24.0 | 0.70 |
| 265 | 1.020 | 5.0 | 48.0 | 14.2 | 0.0 | 97.0 | 40.0 | 0.60 |
| 266 | 1.020 | 5.3 | 41.0 | 13.2 | 0.0 | 133.0 | 17.0 | 1.20 |
| 267 | 1.025 | 4.8 | 48.0 | 13.9 | 0.0 | 122.0 | 33.0 | 0.90 |
| 268 | NaN | 4.9 | 53.0 | 16.3 | NaN | 100.0 | 49.0 | 1.00 |
| 269 | 1.025 | 5.3 | 48.0 | 15.0 | 0.0 | 121.0 | 19.0 | 1.20 |
| 270 | 1.025 | 5.0 | 41.0 | 14.3 | 0.0 | 111.0 | 34.0 | 1.10 |
| 271 | 1.025 | 4.5 | 42.0 | 13.8 | 0.0 | 96.0 | 25.0 | 0.50 |
| 272 | 1.025 | 5.5 | 42.0 | 14.8 | 0.0 | 139.0 | 15.0 | 1.20 |
| 273 | 1.020 | NaN | NaN | NaN | 0.0 | 95.0 | 35.0 | 0.90 |
| 274 | 1.020 | NaN | 44.0 | 14.4 | 0.0 | 107.0 | 23.0 | 0.70 |
| 275 | 1.020 | 4.6 | 43.0 | 16.5 | 0.0 | 125.0 | 22.0 | 1.20 |
| 276 | 1.025 | 5.5 | 41.0 | 14.0 | 0.0 | NaN | NaN | NaN |
| 277 | 1.025 | 4.8 | 50.0 | 15.7 | 0.0 | 123.0 | 46.0 | 1.00 |
| 278 | 1.020 | 6.4 | 44.0 | 14.5 | 0.0 | 112.0 | 44.0 | 1.20 |
| 279 | 1.025 | 5.6 | 48.0 | 16.3 | 0.0 | 140.0 | 23.0 | 0.60 |
| 280 | NaN | 5.2 | 52.0 | 13.3 | NaN | 93.0 | 33.0 | 0.90 |
| 281 | 1.025 | 6.0 | 41.0 | 15.5 | 0.0 | 130.0 | 50.0 | 1.20 |
| 282 | 1.020 | 4.8 | 44.0 | 14.6 | 0.0 | 123.0 | 44.0 | 1.00 |
| 283 | 1.020 | 5.7 | 43.0 | 16.4 | 0.0 | NaN | NaN | NaN |
| 284 | 1.025 | 6.0 | 52.0 | 16.9 | 0.0 | 100.0 | 37.0 | 1.20 |
| 285 | 1.020 | 5.9 | 41.0 | 16.0 | 0.0 | 94.0 | 19.0 | 0.70 |
| 286 | 1.020 | 6.0 | 44.0 | 14.7 | 0.0 | 81.0 | 18.0 | 0.80 |
| 287 | 1.025 | NaN | 43.0 | 13.4 | 0.0 | 124.0 | 22.0 | 0.60 |
| 288 | 1.025 | 5.1 | 50.0 | 15.9 | 0.0 | 70.0 | 46.0 | 1.20 |
| 289 | 1.020 | 5.3 | 43.0 | 16.6 | 0.0 | 93.0 | 32.0 | 0.90 |
| 290 | 1.020 | 5.9 | 52.0 | 14.8 | 0.0 | 76.0 | 28.0 | 0.60 |
| 291 | 1.025 | 5.7 | 41.0 | 14.9 | 0.0 | 124.0 | 44.0 | 1.00 |
| 292 | 1.020 | 5.0 | 52.0 | 16.7 | 0.0 | 89.0 | 42.0 | 0.50 |
| 293 | 1.020 | 5.4 | 48.0 | 14.9 | 0.0 | 92.0 | 19.0 | 1.20 |
| 294 | 1.020 | 5.8 | 40.0 | 14.3 | 0.0 | 110.0 | 50.0 | 0.70 |
| 295 | NaN | 6.5 | 50.0 | 15.0 | NaN | 106.0 | 25.0 | 0.90 |
| 296 | 1.020 | 5.9 | 41.0 | 16.8 | 0.0 | 125.0 | 38.0 | 0.60 |
| 297 | 1.025 | 5.2 | 45.0 | 15.8 | 0.0 | 116.0 | 26.0 | 1.00 |
| 298 | 1.020 | 4.9 | 48.0 | 13.5 | 0.0 | 91.0 | 49.0 | 1.20 |
| 299 | 1.020 | 4.7 | 52.0 | 15.1 | 0.0 | 127.0 | 48.0 | 0.50 |
| 300 | 1.020 | 5.8 | 43.0 | 15.0 | 0.0 | 114.0 | 26.0 | 0.70 |
| 301 | 1.025 | 5.0 | 41.0 | 16.9 | 0.0 | 96.0 | 33.0 | 0.90 |
| 302 | 1.020 | NaN | 48.0 | 14.8 | 0.0 | 127.0 | 44.0 | 1.20 |
| 303 | 1.020 | 6.1 | 50.0 | 17.0 | 0.0 | 107.0 | 26.0 | 1.10 |
| 304 | 1.025 | 4.5 | 45.0 | 13.1 | 0.0 | 128.0 | 38.0 | 0.60 |
| 305 | 1.020 | 5.2 | 41.0 | 17.1 | 0.0 | 122.0 | 25.0 | 0.80 |
| 306 | 1.020 | 5.7 | 52.0 | 15.2 | 0.0 | 128.0 | 30.0 | 1.20 |
| 307 | 1.020 | 4.5 | 44.0 | 13.6 | 0.0 | 137.0 | 17.0 | 0.50 |
| 308 | 1.025 | 4.9 | 48.0 | 13.9 | 0.0 | 81.0 | 46.0 | 0.60 |
| 309 | 1.020 | 5.9 | 40.0 | 17.2 | 0.0 | 129.0 | 25.0 | 1.20 |
| 310 | 1.020 | 5.4 | 44.0 | 13.2 | 0.0 | 102.0 | 27.0 | 0.70 |
| 311 | 1.025 | 5.6 | 45.0 | 13.7 | 0.0 | 132.0 | 18.0 | 1.10 |
| 312 | 1.020 | 6.1 | 48.0 | 15.3 | 0.0 | NaN | NaN | NaN |
| 313 | 1.020 | 4.8 | 52.0 | 17.3 | 0.0 | 104.0 | 28.0 | 0.90 |
| 314 | 1.025 | 4.7 | 41.0 | 15.6 | 0.0 | 131.0 | 46.0 | 0.60 |
| 315 | 1.025 | 4.4 | 48.0 | 13.8 | 0.0 | NaN | NaN | NaN |
| 316 | 1.020 | 5.2 | 48.0 | 15.4 | 0.0 | 99.0 | 30.0 | 0.50 |
| 317 | 1.020 | 4.9 | 40.0 | 15.0 | 0.0 | 102.0 | 48.0 | 1.20 |
| 318 | 1.025 | 5.3 | 52.0 | 17.4 | 0.0 | 120.0 | 29.0 | 0.70 |
| 319 | 1.020 | NaN | NaN | NaN | 0.0 | 138.0 | 15.0 | 1.10 |
| 320 | 1.020 | 6.2 | 44.0 | 15.7 | 0.0 | 105.0 | 49.0 | 1.20 |
| 321 | 1.020 | 4.8 | 48.0 | 13.9 | 0.0 | 109.0 | 39.0 | 1.00 |
| 322 | NaN | 4.9 | 43.0 | 16.0 | NaN | 120.0 | 40.0 | 0.50 |
| 323 | 1.025 | 4.5 | 45.0 | 15.9 | 0.0 | 130.0 | 30.0 | 1.10 |
| 324 | 1.020 | NaN | NaN | NaN | 0.0 | 119.0 | 15.0 | 0.70 |
| 325 | 1.020 | 6.5 | 50.0 | 14.0 | 0.0 | 100.0 | 50.0 | 1.20 |
| 326 | 1.020 | 5.2 | 41.0 | 15.8 | 0.0 | 109.0 | 25.0 | 1.10 |
| 327 | 1.025 | 5.8 | 44.0 | 13.4 | 0.0 | 120.0 | 31.0 | 0.80 |
| 328 | 1.020 | 6.5 | 45.0 | NaN | 0.0 | 131.0 | 29.0 | 0.60 |
| 329 | 1.025 | 5.1 | 48.0 | 14.1 | 0.0 | 80.0 | 25.0 | 0.90 |
| 330 | 1.020 | NaN | 42.0 | NaN | 0.0 | 114.0 | 32.0 | 1.10 |
| 331 | 1.025 | 4.5 | 46.0 | 13.5 | 0.0 | 130.0 | 39.0 | 0.70 |
| 332 | 1.025 | 6.1 | 44.0 | 15.3 | 0.0 | NaN | 33.0 | 1.00 |
| 333 | 1.020 | 5.5 | 46.0 | 17.7 | 0.0 | 99.0 | 46.0 | 1.20 |
| 334 | 1.025 | 4.5 | 43.0 | 15.4 | 0.0 | 125.0 | NaN | NaN |
| 335 | 1.020 | 5.6 | 48.0 | 14.2 | 0.0 | 134.0 | 45.0 | 0.50 |
| 336 | 1.020 | 5.2 | 40.0 | 15.2 | 0.0 | 119.0 | 27.0 | 0.50 |
| 337 | 1.025 | 6.2 | 52.0 | 14.0 | 0.0 | 92.0 | 40.0 | 0.90 |
| 338 | 1.020 | 4.5 | 44.0 | 17.8 | 0.0 | 132.0 | 34.0 | 0.80 |
| 339 | 1.020 | 4.9 | 48.0 | 13.3 | 0.0 | 88.0 | 42.0 | 0.50 |
| 340 | 1.025 | 5.9 | 43.0 | 14.3 | 0.0 | 100.0 | 29.0 | 1.10 |
| 341 | 1.025 | 4.7 | 41.0 | 13.4 | 0.0 | 130.0 | 37.0 | 0.90 |
| 342 | 1.020 | 6.3 | 50.0 | 15.0 | 0.0 | 95.0 | 46.0 | 0.50 |
| 343 | 1.025 | 5.7 | 50.0 | 16.2 | 0.0 | 111.0 | 35.0 | 0.80 |
| 344 | 1.020 | 4.7 | 42.0 | 14.4 | 0.0 | 106.0 | 27.0 | 0.70 |
| 345 | 1.025 | 6.4 | 42.0 | 13.5 | 0.0 | 97.0 | 18.0 | 1.20 |
| 346 | NaN | 5.8 | 52.0 | 15.5 | NaN | 130.0 | 41.0 | 0.90 |
| 347 | 1.025 | 5.5 | 43.0 | 17.8 | 0.0 | 108.0 | 25.0 | 1.00 |
| 348 | 1.020 | 6.4 | 44.0 | 13.6 | 0.0 | 99.0 | 19.0 | 0.50 |
| 349 | 1.025 | 6.1 | 52.0 | 14.5 | 0.0 | 82.0 | 36.0 | 1.10 |
| 350 | 1.025 | 4.5 | 43.0 | 16.1 | 0.0 | 85.0 | 20.0 | 1.00 |
| 351 | 1.020 | 4.7 | 40.0 | 17.5 | 0.0 | 83.0 | 49.0 | 0.90 |
| 352 | 1.020 | 5.2 | 48.0 | 15.0 | 0.0 | 109.0 | 47.0 | 1.10 |
| 353 | 1.020 | 4.5 | 51.0 | 13.6 | 0.0 | 86.0 | 37.0 | 0.60 |
| 354 | 1.025 | 5.1 | 41.0 | 14.6 | 0.0 | 102.0 | 17.0 | 0.40 |
| 355 | 1.020 | 4.6 | 52.0 | 15.0 | 0.0 | 95.0 | 24.0 | 0.80 |
| 356 | 1.025 | 6.1 | 47.0 | 17.1 | 0.0 | 87.0 | 38.0 | 0.50 |
| 357 | 1.025 | 4.9 | 42.0 | 13.6 | 0.0 | 107.0 | 16.0 | 1.10 |
| 358 | 1.020 | 5.6 | 45.0 | 13.0 | 0.0 | 117.0 | 22.0 | 1.20 |
| 359 | 1.020 | 4.5 | 53.0 | 17.2 | 0.0 | 88.0 | 50.0 | 0.60 |
| 360 | 1.025 | 6.2 | 43.0 | 14.7 | 0.0 | 105.0 | 39.0 | 0.50 |
| 361 | 1.020 | 5.8 | 54.0 | 13.7 | 0.0 | 70.0 | 16.0 | 0.70 |
| 362 | 1.025 | 4.8 | 40.0 | 15.0 | 0.0 | 89.0 | 19.0 | 1.10 |
| 363 | 1.025 | 5.2 | 44.0 | 17.8 | 0.0 | 99.0 | 40.0 | 0.50 |
| 364 | 1.025 | 4.7 | 45.0 | 14.8 | 0.0 | 118.0 | 44.0 | 0.70 |
| 365 | 1.020 | 6.3 | NaN | NaN | 0.0 | 93.0 | 46.0 | 1.00 |
| 366 | 1.025 | 5.3 | 46.0 | 15.0 | 0.0 | 81.0 | 15.0 | 0.50 |
| 367 | 1.025 | 6.1 | 50.0 | 17.4 | 0.0 | 125.0 | 41.0 | 1.10 |
| 368 | 1.025 | 5.9 | 45.0 | 14.9 | 0.0 | 82.0 | 42.0 | 0.70 |
| 369 | 1.020 | 4.8 | 46.0 | 13.6 | 0.0 | 107.0 | 48.0 | 0.80 |
| 370 | 1.020 | 5.4 | 50.0 | 16.2 | 0.0 | 83.0 | 42.0 | 1.20 |
| 371 | 1.025 | 5.0 | 51.0 | 17.6 | 0.0 | 79.0 | 50.0 | 0.50 |
| 372 | 1.020 | 5.5 | 52.0 | 15.0 | 0.0 | 109.0 | 26.0 | 0.90 |
| 373 | 1.025 | 4.9 | 47.0 | 13.7 | 0.0 | 133.0 | 38.0 | 1.00 |
| 374 | 1.025 | 6.4 | 40.0 | 16.3 | 0.0 | 111.0 | 44.0 | 1.20 |
| 375 | 1.020 | 5.6 | 48.0 | 15.1 | 0.0 | 74.0 | 41.0 | 0.50 |
| 376 | 1.025 | 5.2 | 53.0 | 16.4 | 0.0 | 88.0 | 16.0 | 1.10 |
| 377 | 1.020 | 4.8 | 49.0 | 13.8 | 0.0 | 97.0 | 27.0 | 0.70 |
| 378 | 1.025 | 5.5 | 42.0 | 15.2 | 0.0 | NaN | NaN | 0.90 |
| 379 | 1.025 | 5.7 | 50.0 | 16.1 | 0.0 | 78.0 | 45.0 | 0.60 |
| 380 | 1.020 | 4.9 | 54.0 | 15.3 | 0.0 | 113.0 | 23.0 | 1.10 |
| 381 | 1.025 | 5.9 | 40.0 | 16.6 | 0.0 | 79.0 | 47.0 | 0.50 |
| 382 | 1.025 | 6.5 | 51.0 | 16.8 | 0.0 | 75.0 | 22.0 | 0.80 |
| 383 | 1.025 | 5.0 | 49.0 | 13.9 | 0.0 | 119.0 | 46.0 | 0.70 |
| 384 | 1.020 | 4.5 | 42.0 | 15.4 | 0.0 | 132.0 | 18.0 | 1.10 |
| 385 | 1.020 | 5.1 | 52.0 | 16.5 | 0.0 | 113.0 | 25.0 | 0.60 |
| 386 | 1.025 | 6.5 | 43.0 | 16.4 | 0.0 | 100.0 | 47.0 | 0.50 |
| 387 | 1.025 | 5.2 | 50.0 | 16.7 | 0.0 | 93.0 | 17.0 | 0.90 |
| 388 | 1.020 | 6.4 | 46.0 | 15.5 | 0.0 | 94.0 | 15.0 | 1.20 |
| 389 | 1.025 | 5.8 | 52.0 | 17.0 | 0.0 | 112.0 | 48.0 | 0.70 |
| 390 | 1.025 | 5.3 | 52.0 | 15.0 | 0.0 | 99.0 | 25.0 | 0.80 |
| 391 | 1.025 | 6.3 | 44.0 | 15.6 | 0.0 | 85.0 | 16.0 | 1.10 |
| 392 | 1.020 | 5.5 | 46.0 | 14.8 | 0.0 | 133.0 | 48.0 | 1.20 |
| 393 | 1.025 | 5.4 | 54.0 | 13.0 | 0.0 | 117.0 | 45.0 | 0.70 |
| 394 | 1.020 | 4.6 | 45.0 | 14.1 | 0.0 | 137.0 | 46.0 | 0.80 |
| 395 | 1.020 | 4.9 | 47.0 | 15.7 | 0.0 | 140.0 | 49.0 | 0.50 |
| 396 | 1.025 | 6.2 | 54.0 | 16.5 | 0.0 | 75.0 | 31.0 | 1.20 |
| 397 | 1.020 | 5.4 | 49.0 | 15.8 | 0.0 | 100.0 | 26.0 | 0.60 |
| 398 | 1.025 | 5.9 | 51.0 | 14.2 | 0.0 | 114.0 | 50.0 | 1.00 |
| 399 | 1.025 | 6.1 | 53.0 | 15.8 | 0.0 | 131.0 | 18.0 | 1.10 |
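The output above is dotted with NaN entries, so before modeling it helps to quantify missingness per column. A minimal sketch using `isna().sum()` on a toy frame (the column names match the table above, but the values and NaN placement are illustrative, not the actual dataset):

```python
import pandas as pd
import numpy as np

# Toy frame mimicking a slice of the numeric features shown above;
# NaN placement is illustrative only.
features = pd.DataFrame({
    "specific_gravity": [1.020, np.nan, 1.010],
    "red_blood_cell_count": [5.2, np.nan, np.nan],
    "packed_cell_volume": [44.0, 38.0, np.nan],
})

# Count missing values per column and rank the worst offenders first
missing = features.isna().sum().sort_values(ascending=False)
print(missing)
```

On the real data the same two lines report which columns need imputation (the `input_treat_missing_value = 'impute'` setting) versus which are mostly complete.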
# Display the positive-class feature matrix
# (indexing with pos_features.columns selects every column, so it is
#  equivalent to displaying the frame itself)
pos_features
| 355 | 1.020 | 4.6 | 52.0 | 15.0 | 0.0 | 95.0 | 24.0 | 0.80 |
| 356 | 1.025 | 6.1 | 47.0 | 17.1 | 0.0 | 87.0 | 38.0 | 0.50 |
| 357 | 1.025 | 4.9 | 42.0 | 13.6 | 0.0 | 107.0 | 16.0 | 1.10 |
| 358 | 1.020 | 5.6 | 45.0 | 13.0 | 0.0 | 117.0 | 22.0 | 1.20 |
| 359 | 1.020 | 4.5 | 53.0 | 17.2 | 0.0 | 88.0 | 50.0 | 0.60 |
| 360 | 1.025 | 6.2 | 43.0 | 14.7 | 0.0 | 105.0 | 39.0 | 0.50 |
| 361 | 1.020 | 5.8 | 54.0 | 13.7 | 0.0 | 70.0 | 16.0 | 0.70 |
| 362 | 1.025 | 4.8 | 40.0 | 15.0 | 0.0 | 89.0 | 19.0 | 1.10 |
| 363 | 1.025 | 5.2 | 44.0 | 17.8 | 0.0 | 99.0 | 40.0 | 0.50 |
| 364 | 1.025 | 4.7 | 45.0 | 14.8 | 0.0 | 118.0 | 44.0 | 0.70 |
| 365 | 1.020 | 6.3 | NaN | NaN | 0.0 | 93.0 | 46.0 | 1.00 |
| 366 | 1.025 | 5.3 | 46.0 | 15.0 | 0.0 | 81.0 | 15.0 | 0.50 |
| 367 | 1.025 | 6.1 | 50.0 | 17.4 | 0.0 | 125.0 | 41.0 | 1.10 |
| 368 | 1.025 | 5.9 | 45.0 | 14.9 | 0.0 | 82.0 | 42.0 | 0.70 |
| 369 | 1.020 | 4.8 | 46.0 | 13.6 | 0.0 | 107.0 | 48.0 | 0.80 |
| 370 | 1.020 | 5.4 | 50.0 | 16.2 | 0.0 | 83.0 | 42.0 | 1.20 |
| 371 | 1.025 | 5.0 | 51.0 | 17.6 | 0.0 | 79.0 | 50.0 | 0.50 |
| 372 | 1.020 | 5.5 | 52.0 | 15.0 | 0.0 | 109.0 | 26.0 | 0.90 |
| 373 | 1.025 | 4.9 | 47.0 | 13.7 | 0.0 | 133.0 | 38.0 | 1.00 |
| 374 | 1.025 | 6.4 | 40.0 | 16.3 | 0.0 | 111.0 | 44.0 | 1.20 |
| 375 | 1.020 | 5.6 | 48.0 | 15.1 | 0.0 | 74.0 | 41.0 | 0.50 |
| 376 | 1.025 | 5.2 | 53.0 | 16.4 | 0.0 | 88.0 | 16.0 | 1.10 |
| 377 | 1.020 | 4.8 | 49.0 | 13.8 | 0.0 | 97.0 | 27.0 | 0.70 |
| 378 | 1.025 | 5.5 | 42.0 | 15.2 | 0.0 | NaN | NaN | 0.90 |
| 379 | 1.025 | 5.7 | 50.0 | 16.1 | 0.0 | 78.0 | 45.0 | 0.60 |
| 380 | 1.020 | 4.9 | 54.0 | 15.3 | 0.0 | 113.0 | 23.0 | 1.10 |
| 381 | 1.025 | 5.9 | 40.0 | 16.6 | 0.0 | 79.0 | 47.0 | 0.50 |
| 382 | 1.025 | 6.5 | 51.0 | 16.8 | 0.0 | 75.0 | 22.0 | 0.80 |
| 383 | 1.025 | 5.0 | 49.0 | 13.9 | 0.0 | 119.0 | 46.0 | 0.70 |
| 384 | 1.020 | 4.5 | 42.0 | 15.4 | 0.0 | 132.0 | 18.0 | 1.10 |
| 385 | 1.020 | 5.1 | 52.0 | 16.5 | 0.0 | 113.0 | 25.0 | 0.60 |
| 386 | 1.025 | 6.5 | 43.0 | 16.4 | 0.0 | 100.0 | 47.0 | 0.50 |
| 387 | 1.025 | 5.2 | 50.0 | 16.7 | 0.0 | 93.0 | 17.0 | 0.90 |
| 388 | 1.020 | 6.4 | 46.0 | 15.5 | 0.0 | 94.0 | 15.0 | 1.20 |
| 389 | 1.025 | 5.8 | 52.0 | 17.0 | 0.0 | 112.0 | 48.0 | 0.70 |
| 390 | 1.025 | 5.3 | 52.0 | 15.0 | 0.0 | 99.0 | 25.0 | 0.80 |
| 391 | 1.025 | 6.3 | 44.0 | 15.6 | 0.0 | 85.0 | 16.0 | 1.10 |
| 392 | 1.020 | 5.5 | 46.0 | 14.8 | 0.0 | 133.0 | 48.0 | 1.20 |
| 393 | 1.025 | 5.4 | 54.0 | 13.0 | 0.0 | 117.0 | 45.0 | 0.70 |
| 394 | 1.020 | 4.6 | 45.0 | 14.1 | 0.0 | 137.0 | 46.0 | 0.80 |
| 395 | 1.020 | 4.9 | 47.0 | 15.7 | 0.0 | 140.0 | 49.0 | 0.50 |
| 396 | 1.025 | 6.2 | 54.0 | 16.5 | 0.0 | 75.0 | 31.0 | 1.20 |
| 397 | 1.020 | 5.4 | 49.0 | 15.8 | 0.0 | 100.0 | 26.0 | 0.60 |
| 398 | 1.025 | 5.9 | 51.0 | 14.2 | 0.0 | 114.0 | 50.0 | 1.00 |
| 399 | 1.025 | 6.1 | 53.0 | 15.8 | 0.0 | 131.0 | 18.0 | 1.10 |
for pos_feature in pos_features:
    kde_plot(pos_feature)

fig, axes = plt.subplots(4, 2, figsize=(20, 20))
for i, ax in zip(range(8), axes.flat):
    sns.kdeplot(pos_features.iloc[:, i], ax=ax, hue=df['class'])
plt.tight_layout()
plt.subplots_adjust(hspace=0.5)
plt.show()
import plotly.express as px

# Define violin and scatter plot helpers
def violin(col):
    fig = px.violin(df, y=col, x="class", color="class", box=True)
    fig.show()

def scatters(col1, col2):
    fig = px.scatter(df, x=col1, y=col2, color="class")
    fig.show()

for pos_feature in pos_features:
    violin(pos_feature)
## look at scatters of some features for pos correlation
# fig,axes = plt.subplots(1,3,figsize=(10,5))
# axes[0].scatter(df['red_blood_cell_count'], df['packed_cell_volume'], c=df['class'])
# axes[0].set_xlabel('red_blood_cell_count')
# axes[0].set_ylabel('packed_cell_volume')
# axes[1].scatter(df['red_blood_cell_count'], df['haemoglobin'], c=df['class'])
# axes[1].set_xlabel('red_blood_cell_count')
# axes[1].set_ylabel('haemoglobin')
# axes[2].scatter(df['packed_cell_volume'], df['haemoglobin'], c=df['class'])
# axes[2].set_xlabel('packed_cell_volume')
# axes[2].set_ylabel('haemoglobin')
# fig.subplots_adjust(wspace=0.4)
# plt.show()
scatters('red_blood_cell_count', 'packed_cell_volume')
scatters('red_blood_cell_count', 'haemoglobin')
scatters('haemoglobin','packed_cell_volume')
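The scatter plots above suggest these features move together; the strength of that relationship can be quantified with `DataFrame.corr()`. A minimal sketch on hypothetical values (not taken from the dataset):

```python
import pandas as pd

# Toy columns with a perfect positive linear relationship (illustrative only)
toy = pd.DataFrame({
    "red_blood_cell_count": [4.5, 5.0, 5.5, 6.0],
    "packed_cell_volume": [40.0, 44.0, 48.0, 52.0],
})

# Pearson correlation: +1 means a perfect positive linear relationship
corr = toy.corr().loc["red_blood_cell_count", "packed_cell_volume"]
```

On the real dataframe, `df[pos_names].corr()` would give the full pairwise matrix in one call.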
pos_names = ['specific_gravity', 'red_blood_cell_count', 'packed_cell_volume', 'haemoglobin', 'sugar', 'blood_glucose_random',
             'blood_urea', 'serum_creatinine']
df_pos_names = df[pos_names]
# DataFrame.plot creates its own figure, so no extra plt.figure() call is needed
df_pos_names.plot(figsize=(20, 10), kind='density', subplots=True, layout=(2, 4), sharex=False)
plt.tight_layout()
## Negative correlation visualization
scatters('red_blood_cell_count','albumin')
scatters('packed_cell_volume','blood_urea')
fig = px.bar(df, x="specific_gravity", y="packed_cell_volume",
             color='class', barmode='group', height=400)
fig.show()
sns.pairplot(df, hue = 'class', palette = 'CMRmap')
import math

# Number of rows and columns in the plot
n_cols = 3
n_rows = math.ceil(len(numerical_columns) / n_cols)

# Check the distribution of every numerical feature by class
fig, ax = plt.subplots(nrows=n_rows, ncols=n_cols, figsize=(30, 30))
row = 0
col = 0
for i in numerical_columns:
    if col > 2:
        row += 1
        col = 0
    axes = ax[row, col]
    sns.boxplot(x=df[input_target_class], y=df[i], ax=axes)
    col += 1
fig.suptitle("Individual Features by Class")
plt.tight_layout()
plt.show()
Machine Learning works on the idea of garbage in, garbage out: if you feed in dirty data, the results won't be good. Hence it is very important to clean the data before training the model.
Scikit-learn estimators require missing values to be imputed beforehand, whereas XGBoost, LightGBM, and similar gradient-boosting libraries can handle missing values natively.
There are various ways to handle missing values. Here you can decide how you want to handle the missing data.
# Select how you wish to treat missing values according to the input provided
if input_treat_missing_value == 'drop':
    # Drop rows with missing values
    df.dropna(inplace=True)
    print(df.shape)
elif input_treat_missing_value == 'impute':
    # Impute numerical columns with the median and categorical columns with the mode
    for col in numerical_columns:
        df[col] = df[col].fillna(df[col].median())
    for col in categorical_columns:
        mode = df[col].mode()[0]
        df[col] = df[col].fillna(mode)
elif input_treat_missing_value == 'ignore':
    print("Ignore missing values")
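The median/mode `fillna` loop above can also be expressed with scikit-learn's `SimpleImputer`, which has the advantage of fitting inside a `Pipeline`. A minimal sketch on a hypothetical toy frame (values are illustrative, not from the dataset):

```python
import numpy as np
import pandas as pd
from sklearn.impute import SimpleImputer

# Toy frame with gaps (hypothetical values, for illustration only)
toy = pd.DataFrame({"age": [50.0, np.nan, 70.0], "bp": [80.0, 90.0, np.nan]})

# Median imputation, equivalent to the fillna loop for numerical columns
imputer = SimpleImputer(strategy="median")
filled = pd.DataFrame(imputer.fit_transform(toy), columns=toy.columns)
```

For categorical columns, `strategy="most_frequent"` mirrors the mode-based fill.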
Encoding is the process of converting data from one form to another. Most machine learning algorithms cannot handle categorical values unless we convert them to numerical values, and many algorithms' performance varies with how the categorical columns are encoded.
There are many ways to encode categorical variables. Some of them are:
df.columns.values
array(['age', 'blood_pressure', 'specific_gravity', 'albumin', 'sugar',
'red_blood_cells', 'pus_cell', 'pus_cell_clumps', 'bacteria',
'blood_glucose_random', 'blood_urea', 'serum_creatinine', 'sodium',
'potassium', 'haemoglobin', 'packed_cell_volume',
'white_blood_cell_count', 'red_blood_cell_count', 'ypertension',
'diabetes_mellitus', 'coronary_artery_disease', 'appetite',
'pedal_edema', 'anemia', 'class'], dtype=object)
# Select the encoding technique according to the input provided
if input_encoding == "LabelEncoder":
    # Use the LabelEncoder class from sklearn
    le = LabelEncoder()
    df[categorical_columns] = df[categorical_columns].apply(lambda col: le.fit_transform(col))
elif input_encoding == "OneHotEncoder":
    # Use the pandas get_dummies function to one-hot encode
    df = pd.get_dummies(df, columns=categorical_columns)
elif input_encoding == "OrdinalEncoder":
    # Use the OrdinalEncoder class from sklearn
    oe = OrdinalEncoder()
    df[categorical_columns] = oe.fit_transform(df[categorical_columns])
elif input_encoding == "FrequencyEncoder":
    # Frequency encode: replace each category by its relative frequency
    for variable in categorical_columns:
        fq = df.groupby(variable).size() / len(df)
        df.loc[:, variable] = df[variable].map(fq)
categorical_columns.remove(input_target_class)
categorical_columns
['bacteria', 'pus_cell_clumps', 'pedal_edema', 'coronary_artery_disease', 'pus_cell', 'red_blood_cells', 'ypertension', 'anemia', 'diabetes_mellitus', 'appetite']
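To make the frequency-encoding branch concrete, here is a minimal sketch on a hypothetical yes/no column (values are illustrative, not from the dataset):

```python
import pandas as pd

# Toy categorical column (hypothetical)
s = pd.Series(["yes", "yes", "no", "yes"])

# Relative frequency of each category: yes -> 0.75, no -> 0.25
fq = s.groupby(s).size() / len(s)

# Each category is replaced by its frequency
encoded = s.map(fq)
```

Unlike one-hot encoding, frequency encoding keeps a single column per variable, but two categories with the same frequency become indistinguishable.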
In this section you will:
Split the X and y dataset
# Split the y variable series and x variables dataset
X = df.drop([input_target_class],axis=1)
y = df[input_target_class]
Feature scaling is a technique to standardize the x variables (features) to a fixed range, and it needs to be done before training the model.
However, if you are using tree-based models, feature scaling is not required: tree splits depend only on the ordering of values, not on their scale.
# Define a function to scale the data using StandardScaler()
def scale_data(data):
    scaler = StandardScaler()
    # Fit and transform the data, keeping the original column names
    scaled_data = scaler.fit_transform(data)
    scaled_data = pd.DataFrame(scaled_data, columns=data.columns)
    return scaled_data
# Scale X dataset
scaled_X = scale_data(X)
scaled_X.head()
| age | blood_pressure | specific_gravity | albumin | sugar | red_blood_cells | pus_cell | pus_cell_clumps | bacteria | blood_glucose_random | blood_urea | serum_creatinine | sodium | potassium | haemoglobin | packed_cell_volume | white_blood_cell_count | red_blood_cell_count | ypertension | diabetes_mellitus | coronary_artery_disease | appetite | pedal_edema | anemia | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | -0.210031 | 0.254214 | 0.421486 | 0.076249 | -0.380269 | 0.36489 | 0.484322 | -0.342518 | -0.241249 | -0.320122 | -0.419451 | -0.319668 | 0.040104 | -0.062903 | 1.053226 | 0.603224 | -0.197314 | 0.550044 | 1.311903 | 1.385535 | -0.304789 | -0.507801 | -0.484322 | -0.420084 |
| 1 | -2.627234 | -1.972476 | 0.421486 | 2.363728 | -0.380269 | 0.36489 | 0.484322 | -0.342518 | -0.241249 | -0.320122 | -0.784315 | -0.390819 | 0.040104 | -0.062903 | -0.457965 | -0.132789 | -0.909782 | 0.074073 | -0.762252 | -0.721743 | -0.304789 | -0.507801 | -0.484322 | -0.420084 |
| 2 | 0.615355 | 0.254214 | -1.421074 | 0.838742 | 2.507853 | 0.36489 | 0.484322 | -0.342518 | -0.241249 | 3.697618 | -0.074858 | -0.212942 | 0.040104 | -0.062903 | -1.084556 | -0.991470 | -0.316059 | 0.074073 | -0.762252 | 1.385535 | -0.304789 | 1.969276 | -0.484322 | 2.380476 |
| 3 | -0.210031 | -0.488016 | -2.342354 | 2.363728 | -0.380269 | 0.36489 | -2.064742 | 2.919556 | -0.241249 | -0.373337 | -0.014047 | 0.142813 | -2.896333 | -0.737181 | -0.494823 | -0.868801 | -0.632711 | -0.996862 | 1.311903 | -0.721743 | -0.304789 | 1.969276 | 2.064742 | 2.380476 |
| 4 | -0.033163 | 0.254214 | -1.421074 | 0.838742 | -0.380269 | 0.36489 | 0.484322 | -0.342518 | -0.241249 | -0.519679 | -0.622154 | -0.284093 | 0.040104 | -0.062903 | -0.347390 | -0.500795 | -0.395222 | -0.163913 | -0.762252 | -0.721743 | -0.304789 | -0.507801 | -0.484322 | -0.420084 |
Split the dataset into training and test sets
# Split the dataset into the training set and test set
X_train, X_test, y_train, y_test = train_test_split(scaled_X, y, test_size = 0.3, random_state = 0)
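Scaling the whole dataset before splitting, as above, lets test-set statistics leak into the training features. A common refinement is to fit the scaler on the training split only and reuse its mean and standard deviation on the test split. A minimal sketch with hypothetical toy data:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# Toy feature matrix and labels (hypothetical)
X = np.arange(20, dtype=float).reshape(10, 2)
y = np.array([0, 1] * 5)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

scaler = StandardScaler().fit(X_tr)   # statistics from the training split only
X_tr_s = scaler.transform(X_tr)
X_te_s = scaler.transform(X_te)       # reuse the same mean/std on the test split
```

With the model zoo below, wrapping the scaler and estimator in a `sklearn.pipeline.Pipeline` would apply this per fold during cross-validation as well.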
Train the model on training data
# Spot-check algorithms
models = []
models.append(('LR', LogisticRegression()))
models.append(('LDA', LinearDiscriminantAnalysis()))
models.append(('KNN', KNeighborsClassifier()))
models.append(('CART', DecisionTreeClassifier()))
models.append(('RFC', RandomForestClassifier()))
models.append(('XGB', XGBClassifier()))
models.append(('NB', GaussianNB()))
models.append(('LGB', LGBMClassifier()))

# Evaluate each model with 10-fold cross-validation on both metrics
score_results = []
score_names = []
num_folds = 10
scorings = ['accuracy', 'roc_auc']
for name, model in models:
    kfold = KFold(n_splits=num_folds)
    for scoring in scorings:
        cv_results = cross_val_score(model, X_train, y_train, cv=kfold, scoring=scoring)
        msg = "%s,%s: %f (%f)" % (name, scoring, cv_results.mean(), cv_results.std())
        print(msg)
        score_names.append(name)
        score_results.append(cv_results)
LR,accuracy: 1.000000 (0.000000)
LR,roc_auc: 1.000000 (0.000000)
LDA,accuracy: 0.953571 (0.027894)
LDA,roc_auc: 0.994787 (0.008848)
KNN,accuracy: 0.964286 (0.027664)
KNN,roc_auc: 0.999745 (0.000765)
CART,accuracy: 0.967857 (0.033693)
CART,roc_auc: 0.956657 (0.035971)
RFC,accuracy: 0.989286 (0.022868)
RFC,roc_auc: 0.999490 (0.001531)
XGB,accuracy: 0.975000 (0.022868)
XGB,roc_auc: 0.995695 (0.007977)
NB,accuracy: 0.960714 (0.037287)
NB,roc_auc: 0.967389 (0.032010)
LGB,accuracy: 0.978571 (0.017496)
LGB,roc_auc: 0.998905 (0.002196)
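`KFold` as used above does not guarantee that each fold preserves the class balance; scikit-learn's `StratifiedKFold` does, which matters when one class is rarer than the other. A minimal sketch with hypothetical imbalanced toy labels:

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold

# Imbalanced toy labels: 8 negatives, 2 positives (hypothetical)
y = np.array([0] * 8 + [1] * 2)

# Stratified splitting keeps the 4:1 class ratio in every fold
skf = StratifiedKFold(n_splits=2, shuffle=True, random_state=0)
pos_per_fold = [int((y[test_idx] == 1).sum())
                for _, test_idx in skf.split(np.zeros((10, 1)), y)]
```

Passing `cv=skf` to `cross_val_score` would drop straight into the loop above.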
Get the predictions from the model on testing data
Get the evaluation metrics to assess the performance of the model on the testing data
# Define a function to compute various evaluation metrics
def compute_evaluation_metric(model, x_test, y_actual, y_predicted, y_predicted_prob):
    print("\n Accuracy Score : \n ", accuracy_score(y_actual, y_predicted))
    print("\n AUC Score : \n", roc_auc_score(y_actual, y_predicted_prob))
    print("\n Confusion Matrix : \n ", confusion_matrix(y_actual, y_predicted))
    print("\n Classification Report : \n", classification_report(y_actual, y_predicted))
    print("\n ROC curve : \n")
    sns.set_style("white")
    plot_roc_curve(model, x_test, y_actual)
    plt.show()
    # Use the function parameters rather than the global X_test/y_test
    print("\n Visualize Confusion Matrix : \n ", plot_confusion_matrix(model, x_test, y_actual, normalize='true'))
    plt.show()
# Train each model, predict on the test set, and evaluate
for name, model in models:
    model.fit(X_train, y_train)
    # Predict classes for the test dataset
    y_pred = model.predict(X_test)
    # Predict the probability of the positive class for the test dataset
    y_pred_prob = model.predict_proba(X_test)
    y_pred_prob = [x[1] for x in y_pred_prob]
    print(name, "Y predicted : ", y_pred)
    print(name, "Y probability predicted : ", y_pred_prob[:5])
    compute_evaluation_metric(model, X_test, y_test, y_pred, y_pred_prob)
LR Y predicted : [0 1 1 1 1 0 1 1 0 0 0 1 1 1 0 1 1 1 0 0 1 1 0 0 0 1 0 1 0 0 1 1 0 0 0 1 1
1 1 0 1 0 1 0 0 1 0 1 1 1 1 0 1 0 0 0 1 1 1 1 0 1 1 1 1 0 0 1 1 0 0 1 1 1
0 1 1 0 1 1 1 0 1 0 1 1 1 0 1 0 0 1 1 0 0 1 0 1 0 0 0 1 0 1 0 1 0 0 1 0 1
0 0 0 1 1 1 1 0 0 1 0 0 0 0 0 0 1 1 1 1 0 1 0 1 1 1 0 1 1 0 0 0 0 1 0 1 1
0 1]
LR Y probability predicted : [5.731991197774028e-07, 0.9991871569917525, 0.8291622892011992, 0.9640428923935979, 0.8482191812727654]
Accuracy Score :
0.9133333333333333
AUC Score :
0.9624822190611665
Confusion Matrix :
[[65 9]
[ 4 72]]
Classification Report :
precision recall f1-score support
0 0.94 0.88 0.91 74
1 0.89 0.95 0.92 76
accuracy 0.91 150
macro avg 0.92 0.91 0.91 150
weighted avg 0.92 0.91 0.91 150
ROC curve :
Visualize Confusion Matrix : <sklearn.metrics._plot.confusion_matrix.ConfusionMatrixDisplay object at 0x2ad999c70>
LDA Y predicted : [0 1 1 1 1 0 1 1 0 0 0 1 1 1 0 1 1 1 0 0 0 1 0 0 0 1 0 1 0 0 1 0 0 0 1 1 1
1 1 0 1 0 1 0 0 0 0 1 1 0 1 0 1 0 0 0 1 0 1 1 0 1 1 1 1 0 0 1 1 0 0 0 1 1
0 1 1 0 1 1 1 0 1 0 1 1 1 0 1 0 0 1 1 0 0 1 0 1 1 0 0 1 0 1 0 1 0 0 1 1 1
0 0 0 1 1 1 1 0 1 1 0 0 1 1 0 0 1 1 1 1 0 1 0 1 1 1 1 1 1 0 0 0 0 1 0 1 1
0 0]
LDA Y probability predicted : [0.007632777413255834, 0.997124192241562, 0.9953446830916425, 0.9953533415089371, 0.9953631521173604]
Accuracy Score :
0.9666666666666667
AUC Score :
0.9992887624466572
Confusion Matrix :
[[69 5]
[ 0 76]]
Classification Report :
precision recall f1-score support
0 1.00 0.93 0.97 74
1 0.94 1.00 0.97 76
accuracy 0.97 150
macro avg 0.97 0.97 0.97 150
weighted avg 0.97 0.97 0.97 150
ROC curve :
Visualize Confusion Matrix : <sklearn.metrics._plot.confusion_matrix.ConfusionMatrixDisplay object at 0x2ad5dc4f0>
KNN Y predicted : [0 1 1 1 1 0 0 1 1 0 1 1 1 1 0 1 1 1 0 1 0 1 0 1 0 1 1 1 0 0 1 0 1 1 0 1 1
1 1 0 1 1 1 0 1 1 1 1 1 1 1 0 1 1 0 0 1 1 0 1 0 1 1 1 1 1 0 1 1 0 0 1 1 1
1 1 0 1 1 1 1 1 1 0 0 1 1 0 1 1 1 1 1 0 0 1 0 1 0 0 0 1 0 1 0 1 1 0 1 1 1
0 0 1 1 1 1 1 0 0 1 0 1 1 1 1 0 1 1 1 1 0 1 0 1 1 1 1 1 1 0 1 0 1 1 1 1 1
1 1]
KNN Y probability predicted : [0.2, 0.8, 1.0, 0.8, 1.0]
Accuracy Score :
0.76
AUC Score :
0.8386379800853485
Confusion Matrix :
[[42 32]
[ 4 72]]
Classification Report :
precision recall f1-score support
0 0.91 0.57 0.70 74
1 0.69 0.95 0.80 76
accuracy 0.76 150
macro avg 0.80 0.76 0.75 150
weighted avg 0.80 0.76 0.75 150
ROC curve :
Visualize Confusion Matrix : <sklearn.metrics._plot.confusion_matrix.ConfusionMatrixDisplay object at 0x2ace69a90>
CART Y predicted : [0 1 1 1 1 0 1 1 0 0 0 1 1 1 0 1 1 1 0 0 0 1 0 0 0 1 0 1 0 0 1 0 0 0 1 1 1
1 1 0 1 0 1 0 0 0 0 1 1 0 1 0 1 0 0 0 1 0 0 1 0 1 1 1 1 0 0 1 1 0 0 0 1 1
0 1 1 0 1 1 1 0 1 0 1 1 1 0 1 0 0 1 1 0 0 1 0 1 0 0 0 1 0 1 0 1 0 0 1 0 1
1 0 0 1 1 1 1 0 0 1 0 0 0 1 0 0 1 1 1 1 0 1 0 1 1 1 1 1 1 0 0 0 0 1 0 1 1
0 0]
CART Y probability predicted : [0.0, 1.0, 1.0, 1.0, 1.0]
Accuracy Score :
0.98
AUC Score :
0.9799075391180655
Confusion Matrix :
[[72 2]
[ 1 75]]
Classification Report :
precision recall f1-score support
0 0.99 0.97 0.98 74
1 0.97 0.99 0.98 76
accuracy 0.98 150
macro avg 0.98 0.98 0.98 150
weighted avg 0.98 0.98 0.98 150
ROC curve :
Visualize Confusion Matrix : <sklearn.metrics._plot.confusion_matrix.ConfusionMatrixDisplay object at 0x2adcde100>
RFC Y predicted : [0 1 1 1 1 0 1 1 0 0 0 1 1 1 0 1 1 1 0 0 0 1 0 0 0 1 0 1 0 0 1 0 0 0 1 1 1
1 1 0 1 0 1 0 0 0 0 1 1 0 1 0 1 0 0 0 1 0 0 1 0 1 1 1 1 0 0 1 1 0 0 0 1 1
0 1 1 0 1 1 1 0 1 0 1 1 1 0 1 0 0 1 1 0 0 1 0 1 0 0 0 1 0 1 0 1 0 0 1 0 1
0 0 0 1 1 1 1 0 0 1 0 0 1 1 0 0 1 1 1 1 0 1 0 1 1 1 1 1 0 0 0 0 0 1 0 1 1
0 0]
RFC Y probability predicted : [0.01, 0.96, 0.94, 1.0, 1.0]
Accuracy Score :
1.0
AUC Score :
1.0
Confusion Matrix :
[[74 0]
[ 0 76]]
Classification Report :
precision recall f1-score support
0 1.00 1.00 1.00 74
1 1.00 1.00 1.00 76
accuracy 1.00 150
macro avg 1.00 1.00 1.00 150
weighted avg 1.00 1.00 1.00 150
ROC curve :
Visualize Confusion Matrix : <sklearn.metrics._plot.confusion_matrix.ConfusionMatrixDisplay object at 0x2ad6b3670>
XGB Y predicted : [0 1 1 1 1 0 1 1 0 0 0 1 1 1 0 1 1 1 0 0 0 1 0 0 0 1 0 1 0 0 1 0 0 0 1 1 1
1 1 0 1 0 1 0 0 0 0 1 1 0 1 0 1 0 0 0 1 0 0 1 0 1 1 1 1 0 0 1 1 0 0 0 1 1
0 1 1 0 1 1 1 0 1 0 1 1 1 0 1 0 0 1 1 0 0 1 0 1 0 0 0 1 0 1 0 1 0 0 1 0 1
0 0 0 1 1 1 1 0 0 1 0 0 1 1 0 0 1 1 1 1 0 1 0 1 1 1 1 1 0 0 0 0 0 1 0 1 1
0 0]
XGB Y probability predicted : [0.003855781, 0.99720335, 0.9504216, 0.9980855, 0.99754983]
Accuracy Score :
1.0
AUC Score :
1.0
Confusion Matrix :
[[74 0]
[ 0 76]]
Classification Report :
precision recall f1-score support
0 1.00 1.00 1.00 74
1 1.00 1.00 1.00 76
accuracy 1.00 150
macro avg 1.00 1.00 1.00 150
weighted avg 1.00 1.00 1.00 150
ROC curve :
Visualize Confusion Matrix : <sklearn.metrics._plot.confusion_matrix.ConfusionMatrixDisplay object at 0x2ad78aee0>
NB Y predicted : [0 1 1 1 1 0 1 1 0 0 0 1 1 1 0 1 1 1 0 0 0 1 0 0 0 1 0 1 0 0 1 0 0 0 1 1 1
1 1 0 1 0 1 0 0 0 0 1 1 1 1 0 1 0 0 0 1 0 0 1 0 1 1 1 1 0 0 1 1 0 0 0 1 1
0 1 1 0 1 1 1 0 1 0 1 1 1 0 1 0 1 1 1 0 0 1 0 1 0 0 0 1 0 1 0 1 0 0 1 0 1
0 0 0 1 1 1 1 0 1 1 0 0 1 1 0 0 1 1 1 1 0 1 0 1 1 1 1 1 0 0 0 0 0 1 0 1 1
0 0]
NB Y probability predicted : [0.0, 1.0, 1.0, 1.0, 0.9999999999999982]
Accuracy Score :
0.98
AUC Score :
1.0
Confusion Matrix :
[[71 3]
[ 0 76]]
Classification Report :
precision recall f1-score support
0 1.00 0.96 0.98 74
1 0.96 1.00 0.98 76
accuracy 0.98 150
macro avg 0.98 0.98 0.98 150
weighted avg 0.98 0.98 0.98 150
ROC curve :
Visualize Confusion Matrix : <sklearn.metrics._plot.confusion_matrix.ConfusionMatrixDisplay object at 0x2ad5a9a30>
LGB Y predicted : [0 1 1 1 1 0 1 1 0 0 0 1 1 1 0 1 1 1 0 0 0 1 0 0 0 1 0 1 0 0 1 0 0 0 1 1 1
1 1 0 1 0 1 0 0 0 0 1 1 0 1 0 1 0 0 0 1 0 0 1 0 1 1 1 1 0 0 1 1 0 0 0 1 1
0 1 1 0 1 1 1 0 1 0 1 1 1 0 1 0 0 1 1 0 0 1 0 1 0 0 0 1 0 1 0 1 0 0 1 0 1
0 0 0 1 1 1 1 0 0 1 0 0 1 1 0 0 1 1 1 1 0 1 0 1 1 1 1 1 1 0 0 0 0 1 0 1 1
0 0]
LGB Y probability predicted : [1.794165376463441e-05, 0.9998377588379368, 0.9995012658447973, 0.999964317268651, 0.9999608652385723]
Accuracy Score :
0.9933333333333333
AUC Score :
1.0
Confusion Matrix :
[[73 1]
[ 0 76]]
Classification Report :
precision recall f1-score support
0 1.00 0.99 0.99 74
1 0.99 1.00 0.99 76
accuracy 0.99 150
macro avg 0.99 0.99 0.99 150
weighted avg 0.99 0.99 0.99 150
ROC curve :
Visualize Confusion Matrix : <sklearn.metrics._plot.confusion_matrix.ConfusionMatrixDisplay object at 0x2adaee880>
# Algorithm comparison using accuracy results
acc_results = []
acc_names = []
for name, model in models:
    kfold = KFold(n_splits=num_folds)
    cv_results = cross_val_score(model, X_train, y_train, cv=kfold, scoring='accuracy')
    acc_results.append(cv_results)
    acc_names.append(name)
    msg = "%s,accuracy: %f (%f)" % (name, cv_results.mean(), cv_results.std())
    print(msg)

fig = plt.figure()
fig.suptitle('Algorithm Comparison with accuracy')
ax = fig.add_subplot(111)
plt.boxplot(acc_results)
ax.set_xticklabels(acc_names)
plt.show()
LR,accuracy: 1.000000 (0.000000)
LDA,accuracy: 0.953571 (0.027894)
KNN,accuracy: 0.964286 (0.027664)
CART,accuracy: 0.957143 (0.041650)
RFC,accuracy: 0.985714 (0.023690)
XGB,accuracy: 0.975000 (0.022868)
NB,accuracy: 0.960714 (0.037287)
LGB,accuracy: 0.978571 (0.017496)
# Algorithm comparison using ROC AUC
roc_results = []
roc_names = []
for name, model in models:
    kfold = KFold(n_splits=num_folds)
    cv_results = cross_val_score(model, X_train, y_train, cv=kfold, scoring='roc_auc')
    roc_results.append(cv_results)
    roc_names.append(name)
    msg = "%s,roc: %f (%f)" % (name, cv_results.mean(), cv_results.std())
    print(msg)

# Box plot of the cross-validated ROC AUC scores per algorithm
fig = plt.figure()
fig.suptitle('Algorithm Comparison with roc')
ax = fig.add_subplot(111)
plt.boxplot(roc_results)
ax.set_xticklabels(roc_names)
plt.show()
LR,roc: 1.000000 (0.000000)
LDA,roc: 0.994787 (0.008848)
KNN,roc: 0.999745 (0.000765)
CART,roc: 0.968926 (0.033248)
RFC,roc: 0.999490 (0.001531)
XGB,roc: 0.995695 (0.007977)
NB,roc: 0.967389 (0.032010)
LGB,roc: 0.998905 (0.002196)
# RFC feature importance
model_rfc = RandomForestClassifier()
# fit the model
model_rfc.fit(X, y)
# get importance
importance = model_rfc.feature_importances_
# summarize feature importance
for i, v in enumerate(importance):
    print('RFC Feature: %s, Score: %.5f' % (df.columns[i], v))
# plot feature importance (xticks takes positions, not labels,
# so pass only the rotation and let bar() set the labels)
plt.bar(df.columns[:len(importance)], importance)
plt.xticks(rotation='vertical')
plt.show()
RFC Feature: age, Score: 0.00854
RFC Feature: blood_pressure, Score: 0.01148
RFC Feature: specific_gravity, Score: 0.09781
RFC Feature: albumin, Score: 0.05770
RFC Feature: sugar, Score: 0.00795
RFC Feature: red_blood_cells, Score: 0.00236
RFC Feature: pus_cell, Score: 0.00488
RFC Feature: pus_cell_clumps, Score: 0.00025
RFC Feature: bacteria, Score: 0.00011
RFC Feature: blood_glucose_random, Score: 0.03102
RFC Feature: blood_urea, Score: 0.03071
RFC Feature: serum_creatinine, Score: 0.13150
RFC Feature: sodium, Score: 0.02908
RFC Feature: potassium, Score: 0.00618
RFC Feature: haemoglobin, Score: 0.20300
RFC Feature: packed_cell_volume, Score: 0.16688
RFC Feature: white_blood_cell_count, Score: 0.00563
RFC Feature: red_blood_cell_count, Score: 0.10408
RFC Feature: ypertension, Score: 0.04409
RFC Feature: diabetes_mellitus, Score: 0.03538
RFC Feature: coronary_artery_disease, Score: 0.00000
RFC Feature: appetite, Score: 0.00993
RFC Feature: pedal_edema, Score: 0.00878
RFC Feature: anemia, Score: 0.00264
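The scores above are easier to read when ranked. A sketch of sorting them with pandas; the values below are the top importances copied from the printed output, used here as placeholder data so the snippet is self-contained:

```python
import pandas as pd

# Top Random Forest importances from the output above (illustrative subset)
importance = [0.20300, 0.16688, 0.13150, 0.10408, 0.09781]
features = ['haemoglobin', 'packed_cell_volume', 'serum_creatinine',
            'red_blood_cell_count', 'specific_gravity']

# Rank features by importance, highest first
fi = pd.Series(importance, index=features).sort_values(ascending=False)
print(fi)
```

In the full notebook the same pattern applies directly: `pd.Series(importance, index=df.columns[:len(importance)]).sort_values(ascending=False)`.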
# Split the dataset into training and test sets after categorical column encoding
no_scale_X_train, no_scale_X_test, no_scale_y_train, no_scale_y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# feature selection with chi-squared scores
def select_features(no_scale_X_train, no_scale_y_train):
    fs = SelectKBest(score_func=chi2, k=20)
    ordered_feature = fs.fit(no_scale_X_train, no_scale_y_train)
    return ordered_feature

ordered_feature = select_features(no_scale_X_train, no_scale_y_train)
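After fitting, `SelectKBest` exposes one chi-squared score per column in `.scores_`, which can be ranked to see which features it would keep. A self-contained sketch with synthetic non-negative data (chi2 requires non-negative inputs; the column names here are illustrative, not the notebook's real matrix):

```python
import numpy as np
import pandas as pd
from sklearn.feature_selection import SelectKBest, chi2

# Synthetic non-negative feature matrix and binary target
rng = np.random.default_rng(0)
X_demo = pd.DataFrame(rng.integers(0, 100, size=(50, 3)),
                      columns=['f1', 'f2', 'f3'])
y_demo = rng.integers(0, 2, size=50)

# Fit chi2 scorer and rank columns by their scores, highest first
fs = SelectKBest(score_func=chi2, k='all').fit(X_demo, y_demo)
scores = pd.Series(fs.scores_, index=X_demo.columns).sort_values(ascending=False)
print(scores)
```

In the notebook itself, `pd.Series(ordered_feature.scores_, index=no_scale_X_train.columns)` gives the same ranking for the real columns.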
no_scale_X_train
| | age | blood_pressure | specific_gravity | albumin | sugar | red_blood_cells | pus_cell | pus_cell_clumps | bacteria | blood_glucose_random | blood_urea | serum_creatinine | sodium | potassium | haemoglobin | packed_cell_volume | white_blood_cell_count | red_blood_cell_count | ypertension | diabetes_mellitus | coronary_artery_disease | appetite | pedal_edema | anemia |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 92 | 71.0 | 70.0 | 1.010 | 3.0 | 0.0 | 1 | 0 | 1 | 1 | 219.0 | 82.0 | 3.60 | 133.0 | 4.4 | 10.40 | 33.0 | 5600.0 | 3.6 | 1 | 1 | 1 | 0 | 0 | 0 |
| 223 | 71.0 | 90.0 | 1.010 | 0.0 | 3.0 | 1 | 1 | 0 | 0 | 303.0 | 30.0 | 1.30 | 136.0 | 4.1 | 13.00 | 38.0 | 9200.0 | 4.6 | 1 | 1 | 0 | 0 | 0 | 0 |
| 234 | 37.0 | 100.0 | 1.010 | 0.0 | 0.0 | 0 | 1 | 0 | 0 | 121.0 | 19.0 | 1.30 | 138.0 | 4.4 | 15.00 | 44.0 | 4100.0 | 5.2 | 1 | 0 | 0 | 0 | 0 | 0 |
| 232 | 50.0 | 90.0 | 1.015 | 1.0 | 0.0 | 0 | 0 | 0 | 0 | 121.0 | 42.0 | 1.30 | 138.0 | 4.4 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 1 | 0 |
| 377 | 64.0 | 70.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 97.0 | 27.0 | 0.70 | 145.0 | 4.8 | 13.80 | 49.0 | 6400.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 142 | 72.0 | 90.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 84.0 | 145.0 | 7.10 | 135.0 | 5.3 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 1 | 0 | 0 | 0 | 0 |
| 22 | 48.0 | 80.0 | 1.025 | 4.0 | 0.0 | 1 | 0 | 0 | 0 | 95.0 | 163.0 | 7.70 | 136.0 | 3.8 | 9.80 | 32.0 | 6900.0 | 3.4 | 1 | 0 | 0 | 0 | 0 | 1 |
| 252 | 45.0 | 80.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 82.0 | 49.0 | 0.60 | 147.0 | 4.4 | 15.90 | 46.0 | 9100.0 | 4.7 | 0 | 0 | 0 | 0 | 0 | 0 |
| 350 | 65.0 | 70.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 85.0 | 20.0 | 1.00 | 142.0 | 4.8 | 16.10 | 43.0 | 9600.0 | 4.5 | 0 | 0 | 0 | 0 | 0 | 0 |
| 168 | 65.0 | 70.0 | 1.015 | 4.0 | 4.0 | 1 | 1 | 1 | 0 | 307.0 | 28.0 | 1.50 | 138.0 | 4.4 | 11.00 | 39.0 | 6700.0 | 4.8 | 1 | 1 | 0 | 0 | 0 | 0 |
| 150 | 8.0 | 60.0 | 1.025 | 3.0 | 0.0 | 1 | 1 | 0 | 0 | 78.0 | 27.0 | 0.90 | 138.0 | 4.4 | 12.30 | 41.0 | 6700.0 | 4.8 | 0 | 0 | 0 | 1 | 1 | 0 |
| 393 | 43.0 | 60.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 117.0 | 45.0 | 0.70 | 141.0 | 4.4 | 13.00 | 54.0 | 7400.0 | 5.4 | 0 | 0 | 0 | 0 | 0 | 0 |
| 66 | 67.0 | 70.0 | 1.020 | 2.0 | 0.0 | 0 | 1 | 0 | 0 | 150.0 | 55.0 | 1.60 | 131.0 | 4.8 | 12.65 | 40.0 | 8000.0 | 4.8 | 1 | 1 | 0 | 0 | 1 | 0 |
| 240 | 65.0 | 70.0 | 1.015 | 1.0 | 0.0 | 1 | 1 | 0 | 0 | 203.0 | 46.0 | 1.40 | 138.0 | 4.4 | 11.40 | 36.0 | 5000.0 | 4.1 | 1 | 1 | 0 | 1 | 1 | 0 |
| 218 | 33.0 | 90.0 | 1.015 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 92.0 | 19.0 | 0.80 | 138.0 | 4.4 | 11.80 | 34.0 | 7000.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 101 | 71.0 | 90.0 | 1.015 | 2.0 | 0.0 | 1 | 0 | 1 | 1 | 88.0 | 80.0 | 4.40 | 139.0 | 5.7 | 11.30 | 33.0 | 10700.0 | 3.9 | 0 | 0 | 0 | 0 | 0 | 0 |
| 311 | 56.0 | 60.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 132.0 | 18.0 | 1.10 | 147.0 | 4.7 | 13.70 | 45.0 | 7500.0 | 5.6 | 0 | 0 | 0 | 0 | 0 | 0 |
| 194 | 80.0 | 70.0 | 1.010 | 2.0 | 0.0 | 1 | 0 | 0 | 0 | 121.0 | 49.0 | 1.20 | 138.0 | 4.4 | 12.65 | 40.0 | 8000.0 | 4.8 | 1 | 1 | 0 | 0 | 0 | 0 |
| 326 | 47.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 109.0 | 25.0 | 1.10 | 141.0 | 4.7 | 15.80 | 41.0 | 8300.0 | 5.2 | 0 | 0 | 0 | 0 | 0 | 0 |
| 17 | 47.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 114.0 | 87.0 | 5.20 | 139.0 | 3.7 | 12.10 | 40.0 | 8000.0 | 4.8 | 1 | 0 | 0 | 1 | 0 | 0 |
| 164 | 14.0 | 80.0 | 1.015 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 192.0 | 15.0 | 0.80 | 137.0 | 4.2 | 14.30 | 40.0 | 9500.0 | 5.4 | 0 | 1 | 0 | 1 | 1 | 0 |
| 186 | 8.0 | 50.0 | 1.020 | 4.0 | 0.0 | 1 | 1 | 0 | 0 | 121.0 | 46.0 | 1.00 | 135.0 | 3.8 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 1 | 0 |
| 30 | 55.0 | 70.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 93.0 | 155.0 | 7.30 | 132.0 | 4.9 | 12.65 | 40.0 | 8000.0 | 4.8 | 1 | 1 | 0 | 0 | 0 | 0 |
| 114 | 12.0 | 60.0 | 1.015 | 3.0 | 0.0 | 0 | 0 | 1 | 0 | 121.0 | 51.0 | 1.80 | 138.0 | 4.4 | 12.10 | 40.0 | 10300.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 263 | 45.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 117.0 | 46.0 | 1.20 | 137.0 | 5.0 | 16.20 | 45.0 | 8600.0 | 5.2 | 0 | 0 | 0 | 0 | 0 | 0 |
| 103 | 76.0 | 70.0 | 1.015 | 2.0 | 0.0 | 1 | 0 | 1 | 0 | 226.0 | 217.0 | 10.20 | 138.0 | 4.4 | 10.20 | 36.0 | 12700.0 | 4.2 | 1 | 0 | 0 | 1 | 1 | 1 |
| 358 | 47.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 117.0 | 22.0 | 1.20 | 138.0 | 3.5 | 13.00 | 45.0 | 5200.0 | 5.6 | 0 | 0 | 0 | 0 | 0 | 0 |
| 245 | 48.0 | 100.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 103.0 | 79.0 | 5.30 | 135.0 | 6.3 | 6.30 | 19.0 | 7200.0 | 2.6 | 1 | 0 | 1 | 1 | 0 | 0 |
| 235 | 45.0 | 70.0 | 1.010 | 2.0 | 0.0 | 1 | 1 | 0 | 0 | 113.0 | 93.0 | 2.30 | 138.0 | 4.4 | 7.90 | 26.0 | 5700.0 | 4.8 | 0 | 0 | 1 | 0 | 0 | 1 |
| 116 | 55.0 | 70.0 | 1.015 | 4.0 | 0.0 | 0 | 1 | 0 | 0 | 104.0 | 16.0 | 0.50 | 138.0 | 4.4 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 1 | 0 |
| 330 | 43.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 114.0 | 32.0 | 1.10 | 135.0 | 3.9 | 12.65 | 42.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 120 | 72.0 | 90.0 | 1.025 | 1.0 | 3.0 | 1 | 1 | 0 | 0 | 323.0 | 40.0 | 2.20 | 137.0 | 5.3 | 12.60 | 40.0 | 8000.0 | 4.8 | 0 | 1 | 1 | 1 | 0 | 0 |
| 289 | 42.0 | 70.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 93.0 | 32.0 | 0.90 | 143.0 | 4.7 | 16.60 | 43.0 | 7100.0 | 5.3 | 0 | 0 | 0 | 0 | 0 | 0 |
| 112 | 55.0 | 60.0 | 1.015 | 3.0 | 0.0 | 0 | 0 | 0 | 0 | 121.0 | 34.0 | 1.20 | 138.0 | 4.4 | 10.80 | 33.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 215 | 2.0 | 80.0 | 1.010 | 3.0 | 0.0 | 1 | 0 | 0 | 0 | 121.0 | 42.0 | 1.30 | 138.0 | 4.4 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 1 | 0 |
| 136 | 46.0 | 90.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 213.0 | 68.0 | 2.80 | 146.0 | 6.3 | 9.30 | 40.0 | 8000.0 | 4.8 | 1 | 1 | 0 | 0 | 0 | 0 |
| 275 | 52.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 125.0 | 22.0 | 1.20 | 139.0 | 4.6 | 16.50 | 43.0 | 4700.0 | 4.6 | 0 | 0 | 0 | 0 | 0 | 0 |
| 126 | 70.0 | 90.0 | 1.015 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 144.0 | 125.0 | 4.00 | 136.0 | 4.6 | 12.00 | 37.0 | 8200.0 | 4.5 | 1 | 1 | 0 | 1 | 1 | 0 |
| 198 | 59.0 | 100.0 | 1.020 | 4.0 | 2.0 | 1 | 1 | 0 | 0 | 252.0 | 40.0 | 3.20 | 137.0 | 4.7 | 11.20 | 30.0 | 26400.0 | 3.9 | 1 | 1 | 0 | 1 | 1 | 0 |
| 299 | 73.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 127.0 | 48.0 | 0.50 | 150.0 | 3.5 | 15.10 | 52.0 | 11000.0 | 4.7 | 0 | 0 | 0 | 0 | 0 | 0 |
| 281 | 55.0 | 80.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 130.0 | 50.0 | 1.20 | 147.0 | 5.0 | 15.50 | 41.0 | 9100.0 | 6.0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 133 | 70.0 | 100.0 | 1.015 | 4.0 | 0.0 | 1 | 1 | 0 | 0 | 118.0 | 125.0 | 5.30 | 136.0 | 4.9 | 12.00 | 37.0 | 8400.0 | 8.0 | 1 | 0 | 0 | 0 | 0 | 0 |
| 33 | 60.0 | 100.0 | 1.020 | 2.0 | 0.0 | 0 | 0 | 0 | 0 | 140.0 | 55.0 | 2.50 | 138.0 | 4.4 | 10.10 | 29.0 | 8000.0 | 4.8 | 1 | 0 | 0 | 1 | 0 | 0 |
| 378 | 71.0 | 60.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 121.0 | 42.0 | 0.90 | 140.0 | 4.8 | 15.20 | 42.0 | 7700.0 | 5.5 | 0 | 0 | 0 | 0 | 0 | 0 |
| 162 | 59.0 | 70.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 204.0 | 34.0 | 1.50 | 124.0 | 4.1 | 9.80 | 37.0 | 6000.0 | 4.8 | 0 | 1 | 0 | 0 | 0 | 0 |
| 34 | 70.0 | 70.0 | 1.010 | 1.0 | 0.0 | 1 | 1 | 1 | 1 | 171.0 | 153.0 | 5.20 | 138.0 | 4.4 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 1 | 0 | 1 | 0 | 0 |
| 231 | 60.0 | 90.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 269.0 | 51.0 | 2.80 | 138.0 | 3.7 | 11.50 | 35.0 | 8000.0 | 4.8 | 1 | 1 | 1 | 0 | 1 | 0 |
| 97 | 65.0 | 60.0 | 1.015 | 1.0 | 0.0 | 1 | 1 | 0 | 0 | 91.0 | 51.0 | 2.20 | 132.0 | 3.8 | 10.00 | 32.0 | 9100.0 | 4.0 | 1 | 1 | 0 | 1 | 1 | 0 |
| 85 | 70.0 | 70.0 | 1.015 | 2.0 | 0.0 | 1 | 1 | 0 | 0 | 121.0 | 46.0 | 1.50 | 138.0 | 4.4 | 9.90 | 40.0 | 8000.0 | 4.8 | 0 | 1 | 0 | 1 | 1 | 0 |
| 61 | 67.0 | 80.0 | 1.010 | 1.0 | 3.0 | 1 | 0 | 0 | 0 | 182.0 | 391.0 | 32.00 | 163.0 | 39.0 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 1 | 0 |
| 167 | 34.0 | 70.0 | 1.020 | 0.0 | 0.0 | 0 | 1 | 0 | 0 | 139.0 | 19.0 | 0.90 | 138.0 | 4.4 | 12.70 | 42.0 | 2200.0 | 4.8 | 0 | 0 | 0 | 1 | 0 | 0 |
| 282 | 20.0 | 70.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 123.0 | 44.0 | 1.00 | 135.0 | 3.8 | 14.60 | 44.0 | 5500.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 200 | 90.0 | 90.0 | 1.025 | 1.0 | 0.0 | 1 | 1 | 0 | 0 | 139.0 | 89.0 | 3.00 | 140.0 | 4.1 | 12.00 | 37.0 | 7900.0 | 3.9 | 1 | 1 | 0 | 0 | 0 | 0 |
| 391 | 36.0 | 80.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 85.0 | 16.0 | 1.10 | 142.0 | 4.1 | 15.60 | 44.0 | 5800.0 | 6.3 | 0 | 0 | 0 | 0 | 0 | 0 |
| 230 | 65.0 | 60.0 | 1.010 | 2.0 | 0.0 | 1 | 0 | 1 | 0 | 192.0 | 17.0 | 1.70 | 130.0 | 4.3 | 12.65 | 40.0 | 9500.0 | 4.8 | 1 | 1 | 0 | 1 | 0 | 0 |
| 287 | 39.0 | 70.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 124.0 | 22.0 | 0.60 | 137.0 | 3.8 | 13.40 | 43.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 108 | 45.0 | 80.0 | 1.015 | 0.0 | 0.0 | 1 | 0 | 0 | 0 | 107.0 | 15.0 | 1.00 | 141.0 | 4.2 | 11.80 | 37.0 | 10200.0 | 4.2 | 0 | 0 | 0 | 0 | 0 | 0 |
| 46 | 48.0 | 70.0 | 1.015 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 124.0 | 24.0 | 1.20 | 142.0 | 4.2 | 12.40 | 37.0 | 6400.0 | 4.7 | 0 | 1 | 0 | 0 | 0 | 0 |
| 320 | 57.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 105.0 | 49.0 | 1.20 | 150.0 | 4.7 | 15.70 | 44.0 | 10400.0 | 6.2 | 0 | 0 | 0 | 0 | 0 | 0 |
| 396 | 42.0 | 70.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 75.0 | 31.0 | 1.20 | 141.0 | 3.5 | 16.50 | 54.0 | 7800.0 | 6.2 | 0 | 0 | 0 | 0 | 0 | 0 |
| 224 | 34.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 117.0 | 28.0 | 2.20 | 138.0 | 3.8 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 1 | 0 |
| 73 | 55.0 | 100.0 | 1.015 | 2.0 | 0.0 | 0 | 0 | 0 | 0 | 129.0 | 107.0 | 6.70 | 132.0 | 4.4 | 4.80 | 14.0 | 6300.0 | 4.8 | 1 | 0 | 0 | 0 | 1 | 1 |
| 137 | 45.0 | 60.0 | 1.010 | 2.0 | 0.0 | 1 | 0 | 1 | 0 | 268.0 | 86.0 | 4.00 | 134.0 | 5.1 | 10.00 | 29.0 | 9200.0 | 4.8 | 1 | 1 | 0 | 0 | 0 | 0 |
| 381 | 71.0 | 70.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 79.0 | 47.0 | 0.50 | 142.0 | 4.8 | 16.60 | 40.0 | 5800.0 | 5.9 | 0 | 0 | 0 | 0 | 0 | 0 |
| 220 | 36.0 | 80.0 | 1.010 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 103.0 | 42.0 | 1.30 | 138.0 | 4.4 | 11.90 | 36.0 | 8800.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 210 | 59.0 | 100.0 | 1.015 | 4.0 | 2.0 | 1 | 1 | 0 | 0 | 255.0 | 132.0 | 12.80 | 135.0 | 5.7 | 7.30 | 20.0 | 9800.0 | 3.9 | 1 | 1 | 1 | 0 | 0 | 1 |
| 29 | 68.0 | 70.0 | 1.005 | 1.0 | 0.0 | 0 | 0 | 1 | 0 | 121.0 | 28.0 | 1.40 | 138.0 | 4.4 | 12.90 | 38.0 | 8000.0 | 4.8 | 0 | 0 | 1 | 0 | 0 | 0 |
| 181 | 45.0 | 70.0 | 1.025 | 2.0 | 0.0 | 1 | 0 | 1 | 0 | 117.0 | 52.0 | 2.20 | 136.0 | 3.8 | 10.00 | 30.0 | 19100.0 | 3.7 | 0 | 0 | 0 | 0 | 0 | 0 |
| 360 | 35.0 | 60.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 105.0 | 39.0 | 0.50 | 135.0 | 3.9 | 14.70 | 43.0 | 5800.0 | 6.2 | 0 | 0 | 0 | 0 | 0 | 0 |
| 271 | 30.0 | 80.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 96.0 | 25.0 | 0.50 | 144.0 | 4.8 | 13.80 | 42.0 | 9000.0 | 4.5 | 0 | 0 | 0 | 0 | 0 | 0 |
| 51 | 54.0 | 100.0 | 1.015 | 3.0 | 0.0 | 1 | 1 | 1 | 0 | 162.0 | 66.0 | 1.60 | 136.0 | 4.4 | 10.30 | 33.0 | 8000.0 | 4.8 | 1 | 1 | 0 | 1 | 1 | 0 |
| 328 | 28.0 | 70.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 131.0 | 29.0 | 0.60 | 145.0 | 4.9 | 12.65 | 45.0 | 8600.0 | 6.5 | 0 | 0 | 0 | 0 | 0 | 0 |
| 352 | 37.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 109.0 | 47.0 | 1.10 | 141.0 | 4.9 | 15.00 | 48.0 | 7000.0 | 5.2 | 0 | 0 | 0 | 0 | 0 | 0 |
| 27 | 69.0 | 70.0 | 1.010 | 3.0 | 4.0 | 1 | 0 | 0 | 0 | 264.0 | 87.0 | 2.70 | 130.0 | 4.0 | 12.50 | 37.0 | 9600.0 | 4.1 | 1 | 1 | 1 | 0 | 1 | 0 |
| 2 | 62.0 | 80.0 | 1.010 | 2.0 | 3.0 | 1 | 1 | 0 | 0 | 423.0 | 53.0 | 1.80 | 138.0 | 4.4 | 9.60 | 31.0 | 7500.0 | 4.8 | 0 | 1 | 0 | 1 | 0 | 1 |
| 217 | 63.0 | 100.0 | 1.010 | 1.0 | 0.0 | 1 | 1 | 0 | 0 | 78.0 | 61.0 | 1.80 | 141.0 | 4.4 | 12.20 | 36.0 | 10500.0 | 4.3 | 0 | 1 | 0 | 0 | 0 | 0 |
| 156 | 66.0 | 90.0 | 1.015 | 2.0 | 0.0 | 1 | 1 | 0 | 1 | 153.0 | 76.0 | 3.30 | 138.0 | 4.4 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 1 | 0 | 0 |
| 212 | 40.0 | 70.0 | 1.015 | 3.0 | 4.0 | 1 | 1 | 0 | 0 | 253.0 | 150.0 | 11.90 | 132.0 | 5.6 | 10.90 | 31.0 | 8800.0 | 3.4 | 1 | 1 | 0 | 1 | 1 | 0 |
| 376 | 58.0 | 70.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 88.0 | 16.0 | 1.10 | 147.0 | 3.5 | 16.40 | 53.0 | 9100.0 | 5.2 | 0 | 0 | 0 | 0 | 0 | 0 |
| 221 | 66.0 | 70.0 | 1.020 | 1.0 | 0.0 | 1 | 1 | 0 | 0 | 248.0 | 30.0 | 1.70 | 138.0 | 5.3 | 12.65 | 40.0 | 8000.0 | 4.8 | 1 | 1 | 0 | 0 | 0 | 0 |
| 138 | 73.0 | 80.0 | 1.010 | 1.0 | 0.0 | 1 | 1 | 0 | 0 | 95.0 | 51.0 | 1.60 | 142.0 | 3.5 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 236 | 65.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 74.0 | 66.0 | 2.00 | 136.0 | 5.4 | 9.10 | 25.0 | 8000.0 | 4.8 | 1 | 1 | 1 | 0 | 1 | 0 |
| 219 | 68.0 | 90.0 | 1.010 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 238.0 | 57.0 | 2.50 | 138.0 | 4.4 | 9.80 | 28.0 | 8000.0 | 3.3 | 1 | 1 | 0 | 1 | 0 | 0 |
| 274 | 19.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 107.0 | 23.0 | 0.70 | 141.0 | 4.2 | 14.40 | 44.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 278 | 48.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 112.0 | 44.0 | 1.20 | 142.0 | 4.9 | 14.50 | 44.0 | 9400.0 | 6.4 | 0 | 0 | 0 | 0 | 0 | 0 |
| 307 | 47.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 137.0 | 17.0 | 0.50 | 150.0 | 3.5 | 13.60 | 44.0 | 7900.0 | 4.5 | 0 | 0 | 0 | 0 | 0 | 0 |
| 239 | 34.0 | 90.0 | 1.015 | 2.0 | 0.0 | 1 | 1 | 0 | 0 | 104.0 | 50.0 | 1.60 | 137.0 | 4.1 | 11.90 | 39.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 35 | 65.0 | 90.0 | 1.020 | 2.0 | 1.0 | 0 | 1 | 0 | 0 | 270.0 | 39.0 | 2.00 | 138.0 | 4.4 | 12.00 | 36.0 | 9800.0 | 4.9 | 1 | 1 | 0 | 1 | 0 | 1 |
| 204 | 65.0 | 90.0 | 1.010 | 4.0 | 2.0 | 1 | 1 | 0 | 0 | 172.0 | 82.0 | 13.50 | 145.0 | 6.3 | 8.80 | 31.0 | 8000.0 | 4.8 | 1 | 1 | 0 | 0 | 1 | 1 |
| 392 | 57.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 133.0 | 48.0 | 1.20 | 147.0 | 4.3 | 14.80 | 46.0 | 6600.0 | 5.5 | 0 | 0 | 0 | 0 | 0 | 0 |
| 67 | 45.0 | 80.0 | 1.020 | 3.0 | 0.0 | 1 | 0 | 0 | 0 | 425.0 | 42.0 | 1.30 | 138.0 | 4.4 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 1 | 0 | 0 |
| 24 | 42.0 | 100.0 | 1.015 | 4.0 | 0.0 | 1 | 0 | 0 | 1 | 121.0 | 50.0 | 1.40 | 129.0 | 4.0 | 11.10 | 39.0 | 8300.0 | 4.6 | 1 | 0 | 0 | 1 | 0 | 0 |
| 332 | 34.0 | 70.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 121.0 | 33.0 | 1.00 | 150.0 | 5.0 | 15.30 | 44.0 | 10500.0 | 6.1 | 0 | 0 | 0 | 0 | 0 | 0 |
| 44 | 54.0 | 80.0 | 1.010 | 3.0 | 0.0 | 0 | 0 | 0 | 0 | 207.0 | 77.0 | 6.30 | 134.0 | 4.8 | 9.70 | 28.0 | 8000.0 | 4.8 | 1 | 1 | 0 | 1 | 1 | 0 |
| 241 | 57.0 | 70.0 | 1.015 | 1.0 | 0.0 | 1 | 0 | 0 | 0 | 165.0 | 45.0 | 1.50 | 140.0 | 3.3 | 10.40 | 31.0 | 4200.0 | 3.9 | 0 | 0 | 0 | 0 | 0 | 0 |
| 129 | 75.0 | 70.0 | 1.025 | 1.0 | 0.0 | 1 | 1 | 0 | 0 | 158.0 | 49.0 | 1.40 | 135.0 | 4.7 | 11.10 | 40.0 | 8000.0 | 4.8 | 1 | 0 | 0 | 1 | 1 | 0 |
| 93 | 73.0 | 100.0 | 1.010 | 3.0 | 2.0 | 0 | 0 | 1 | 0 | 295.0 | 90.0 | 5.60 | 140.0 | 2.9 | 9.20 | 30.0 | 7000.0 | 3.2 | 1 | 1 | 1 | 1 | 0 | 0 |
| 111 | 65.0 | 80.0 | 1.010 | 3.0 | 3.0 | 1 | 1 | 0 | 0 | 294.0 | 71.0 | 4.40 | 128.0 | 5.4 | 10.00 | 32.0 | 9000.0 | 3.9 | 1 | 1 | 1 | 0 | 0 | 0 |
| 166 | 27.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 76.0 | 44.0 | 3.90 | 127.0 | 4.3 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 1 | 1 | 1 |
| 389 | 41.0 | 80.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 112.0 | 48.0 | 0.70 | 140.0 | 5.0 | 17.00 | 52.0 | 7200.0 | 5.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 383 | 80.0 | 80.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 119.0 | 46.0 | 0.70 | 141.0 | 4.9 | 13.90 | 49.0 | 5100.0 | 5.0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 342 | 44.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 95.0 | 46.0 | 0.50 | 138.0 | 4.2 | 15.00 | 50.0 | 7700.0 | 6.3 | 0 | 0 | 0 | 0 | 0 | 0 |
| 40 | 46.0 | 90.0 | 1.010 | 2.0 | 0.0 | 1 | 0 | 0 | 0 | 99.0 | 80.0 | 2.10 | 138.0 | 4.4 | 11.10 | 32.0 | 9100.0 | 4.1 | 1 | 0 | 0 | 0 | 0 | 0 |
| 18 | 60.0 | 100.0 | 1.025 | 0.0 | 3.0 | 1 | 1 | 0 | 0 | 263.0 | 27.0 | 1.30 | 135.0 | 4.3 | 12.70 | 37.0 | 11400.0 | 4.3 | 1 | 1 | 1 | 0 | 0 | 0 |
| 284 | 33.0 | 80.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 100.0 | 37.0 | 1.20 | 142.0 | 4.0 | 16.90 | 52.0 | 6700.0 | 6.0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 79 | 56.0 | 80.0 | 1.010 | 1.0 | 0.0 | 1 | 1 | 0 | 0 | 165.0 | 55.0 | 1.80 | 138.0 | 4.4 | 13.50 | 40.0 | 11800.0 | 5.0 | 1 | 1 | 0 | 1 | 1 | 0 |
| 249 | 56.0 | 90.0 | 1.010 | 4.0 | 1.0 | 1 | 0 | 1 | 0 | 176.0 | 309.0 | 13.30 | 124.0 | 6.5 | 3.10 | 9.0 | 5400.0 | 2.1 | 1 | 1 | 0 | 1 | 1 | 1 |
| 394 | 50.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 137.0 | 46.0 | 0.80 | 139.0 | 5.0 | 14.10 | 45.0 | 9500.0 | 4.6 | 0 | 0 | 0 | 0 | 0 | 0 |
| 71 | 46.0 | 60.0 | 1.010 | 1.0 | 0.0 | 1 | 1 | 0 | 0 | 163.0 | 92.0 | 3.30 | 141.0 | 4.0 | 9.80 | 28.0 | 14600.0 | 3.2 | 1 | 1 | 0 | 0 | 0 | 0 |
| 13 | 68.0 | 70.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 98.0 | 86.0 | 4.60 | 135.0 | 3.4 | 9.80 | 40.0 | 8000.0 | 4.8 | 1 | 1 | 1 | 1 | 1 | 0 |
| 367 | 68.0 | 60.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 125.0 | 41.0 | 1.10 | 139.0 | 3.8 | 17.40 | 50.0 | 6700.0 | 6.1 | 0 | 0 | 0 | 0 | 0 | 0 |
| 213 | 55.0 | 80.0 | 1.010 | 3.0 | 1.0 | 1 | 0 | 1 | 1 | 214.0 | 73.0 | 3.90 | 137.0 | 4.9 | 10.90 | 34.0 | 7400.0 | 3.7 | 1 | 1 | 0 | 0 | 1 | 0 |
| 385 | 63.0 | 70.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 113.0 | 25.0 | 0.60 | 146.0 | 4.9 | 16.50 | 52.0 | 8000.0 | 5.1 | 0 | 0 | 0 | 0 | 0 | 0 |
| 388 | 51.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 94.0 | 15.0 | 1.20 | 144.0 | 3.7 | 15.50 | 46.0 | 9500.0 | 6.4 | 0 | 0 | 0 | 0 | 0 | 0 |
| 228 | 60.0 | 70.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 124.0 | 52.0 | 2.50 | 138.0 | 4.4 | 12.65 | 40.0 | 8000.0 | 4.8 | 1 | 0 | 0 | 0 | 0 | 0 |
| 160 | 81.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 148.0 | 39.0 | 2.10 | 147.0 | 4.2 | 10.90 | 35.0 | 9400.0 | 2.4 | 1 | 1 | 1 | 1 | 1 | 0 |
| 104 | 55.0 | 90.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 143.0 | 88.0 | 2.00 | 138.0 | 4.4 | 12.65 | 40.0 | 8000.0 | 4.8 | 1 | 1 | 0 | 1 | 1 | 0 |
| 161 | 62.0 | 80.0 | 1.015 | 3.0 | 0.0 | 0 | 1 | 0 | 0 | 121.0 | 42.0 | 1.30 | 138.0 | 4.4 | 14.30 | 42.0 | 10200.0 | 4.8 | 1 | 1 | 0 | 0 | 0 | 0 |
| 83 | 48.0 | 70.0 | 1.015 | 1.0 | 0.0 | 1 | 1 | 0 | 0 | 127.0 | 19.0 | 1.00 | 134.0 | 3.6 | 12.65 | 40.0 | 8000.0 | 4.8 | 1 | 1 | 0 | 0 | 0 | 0 |
| 189 | 64.0 | 60.0 | 1.010 | 4.0 | 1.0 | 0 | 0 | 0 | 1 | 239.0 | 58.0 | 4.30 | 137.0 | 5.4 | 9.50 | 29.0 | 7500.0 | 3.4 | 1 | 1 | 0 | 1 | 1 | 0 |
| 397 | 12.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 100.0 | 26.0 | 0.60 | 137.0 | 4.4 | 15.80 | 49.0 | 6600.0 | 5.4 | 0 | 0 | 0 | 0 | 0 | 0 |
| 118 | 55.0 | 70.0 | 1.010 | 3.0 | 0.0 | 1 | 1 | 0 | 0 | 99.0 | 25.0 | 1.20 | 138.0 | 4.4 | 11.40 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 1 | 1 | 0 |
| 254 | 51.0 | 60.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 99.0 | 38.0 | 0.80 | 135.0 | 3.7 | 13.00 | 49.0 | 8300.0 | 5.2 | 0 | 0 | 0 | 0 | 0 | 0 |
| 188 | 8.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 80.0 | 66.0 | 2.50 | 142.0 | 3.6 | 12.20 | 38.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 208 | 67.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 341.0 | 37.0 | 1.50 | 138.0 | 4.4 | 12.30 | 41.0 | 6900.0 | 4.9 | 1 | 1 | 0 | 0 | 0 | 1 |
| 375 | 70.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 74.0 | 41.0 | 0.50 | 143.0 | 4.5 | 15.10 | 48.0 | 9700.0 | 5.6 | 0 | 0 | 0 | 0 | 0 | 0 |
| 110 | 63.0 | 90.0 | 1.015 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 123.0 | 19.0 | 2.00 | 142.0 | 3.8 | 11.70 | 34.0 | 11400.0 | 4.7 | 0 | 0 | 0 | 0 | 0 | 0 |
| 149 | 65.0 | 70.0 | 1.020 | 1.0 | 0.0 | 0 | 0 | 0 | 0 | 139.0 | 29.0 | 1.00 | 138.0 | 4.4 | 10.50 | 32.0 | 8000.0 | 4.8 | 1 | 0 | 0 | 0 | 1 | 0 |
| 157 | 62.0 | 70.0 | 1.025 | 3.0 | 0.0 | 1 | 0 | 0 | 0 | 122.0 | 42.0 | 1.70 | 136.0 | 4.7 | 12.60 | 39.0 | 7900.0 | 3.9 | 1 | 1 | 0 | 0 | 0 | 0 |
| 152 | 39.0 | 70.0 | 1.010 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 121.0 | 20.0 | 0.80 | 133.0 | 3.5 | 10.90 | 32.0 | 8000.0 | 4.8 | 0 | 1 | 0 | 0 | 0 | 0 |
| 16 | 47.0 | 70.0 | 1.015 | 2.0 | 0.0 | 1 | 1 | 0 | 0 | 99.0 | 46.0 | 2.20 | 138.0 | 4.1 | 12.60 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 269 | 25.0 | 80.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 121.0 | 19.0 | 1.20 | 142.0 | 4.9 | 15.00 | 48.0 | 6900.0 | 5.3 | 0 | 0 | 0 | 0 | 0 | 0 |
| 75 | 5.0 | 80.0 | 1.015 | 1.0 | 0.0 | 1 | 1 | 0 | 0 | 121.0 | 16.0 | 0.70 | 138.0 | 3.2 | 8.10 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 1 |
| 109 | 54.0 | 70.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 233.0 | 50.1 | 1.90 | 138.0 | 4.4 | 11.70 | 40.0 | 8000.0 | 4.8 | 0 | 1 | 0 | 0 | 0 | 0 |
| 327 | 30.0 | 60.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 120.0 | 31.0 | 0.80 | 150.0 | 4.6 | 13.40 | 44.0 | 10700.0 | 5.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 205 | 61.0 | 70.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 100.0 | 28.0 | 2.10 | 138.0 | 4.4 | 12.60 | 43.0 | 8000.0 | 4.8 | 1 | 1 | 0 | 0 | 0 | 0 |
| 315 | 44.0 | 70.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 121.0 | 42.0 | 1.30 | 138.0 | 4.4 | 13.80 | 48.0 | 7800.0 | 4.4 | 0 | 0 | 0 | 0 | 0 | 0 |
| 139 | 41.0 | 70.0 | 1.015 | 2.0 | 0.0 | 1 | 0 | 0 | 1 | 121.0 | 68.0 | 2.80 | 132.0 | 4.1 | 11.10 | 33.0 | 8000.0 | 4.8 | 1 | 0 | 0 | 0 | 1 | 1 |
| 237 | 80.0 | 70.0 | 1.015 | 2.0 | 2.0 | 1 | 1 | 0 | 0 | 141.0 | 53.0 | 2.20 | 138.0 | 4.4 | 12.70 | 40.0 | 9600.0 | 4.8 | 1 | 1 | 0 | 1 | 1 | 0 |
| 319 | 30.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 138.0 | 15.0 | 1.10 | 135.0 | 4.4 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 248 | 59.0 | 70.0 | 1.010 | 1.0 | 3.0 | 0 | 0 | 0 | 0 | 424.0 | 55.0 | 1.70 | 138.0 | 4.5 | 12.60 | 37.0 | 10200.0 | 4.1 | 1 | 1 | 1 | 0 | 0 | 0 |
| 308 | 43.0 | 80.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 81.0 | 46.0 | 0.60 | 135.0 | 4.9 | 13.90 | 48.0 | 6900.0 | 4.9 | 0 | 0 | 0 | 0 | 0 | 0 |
| 19 | 62.0 | 60.0 | 1.015 | 1.0 | 0.0 | 1 | 0 | 1 | 0 | 100.0 | 31.0 | 1.60 | 138.0 | 4.4 | 10.30 | 30.0 | 5300.0 | 3.7 | 1 | 0 | 1 | 0 | 0 | 0 |
| 226 | 64.0 | 100.0 | 1.015 | 4.0 | 2.0 | 0 | 0 | 0 | 1 | 163.0 | 54.0 | 7.20 | 140.0 | 4.6 | 7.90 | 26.0 | 7500.0 | 3.4 | 1 | 1 | 0 | 0 | 1 | 0 |
| 306 | 52.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 128.0 | 30.0 | 1.20 | 140.0 | 4.5 | 15.20 | 52.0 | 4300.0 | 5.7 | 0 | 0 | 0 | 0 | 0 | 0 |
| 3 | 48.0 | 70.0 | 1.005 | 4.0 | 0.0 | 1 | 0 | 1 | 0 | 117.0 | 56.0 | 3.80 | 111.0 | 2.5 | 11.20 | 32.0 | 6700.0 | 3.9 | 1 | 0 | 0 | 1 | 1 | 1 |
| 276 | 20.0 | 60.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 121.0 | 42.0 | 1.30 | 137.0 | 4.7 | 14.00 | 41.0 | 4500.0 | 5.5 | 0 | 0 | 0 | 0 | 0 | 0 |
| 125 | 72.0 | 90.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 308.0 | 36.0 | 2.50 | 131.0 | 4.3 | 12.65 | 40.0 | 8000.0 | 4.8 | 1 | 1 | 0 | 1 | 0 | 0 |
| 77 | 67.0 | 70.0 | 1.010 | 1.0 | 0.0 | 1 | 1 | 0 | 0 | 102.0 | 48.0 | 3.20 | 137.0 | 5.0 | 11.90 | 34.0 | 7100.0 | 3.7 | 1 | 1 | 0 | 0 | 1 | 0 |
| 184 | 54.0 | 60.0 | 1.015 | 3.0 | 2.0 | 1 | 0 | 0 | 0 | 352.0 | 137.0 | 3.30 | 133.0 | 4.5 | 11.30 | 31.0 | 5800.0 | 3.6 | 1 | 1 | 1 | 1 | 1 | 0 |
| 301 | 44.0 | 60.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 96.0 | 33.0 | 0.90 | 147.0 | 4.5 | 16.90 | 41.0 | 7200.0 | 5.0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 379 | 62.0 | 80.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 78.0 | 45.0 | 0.60 | 138.0 | 3.5 | 16.10 | 50.0 | 5400.0 | 5.7 | 0 | 0 | 0 | 0 | 0 | 0 |
| 346 | 33.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 130.0 | 41.0 | 0.90 | 141.0 | 4.4 | 15.50 | 52.0 | 4300.0 | 5.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 182 | 61.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 131.0 | 23.0 | 0.80 | 140.0 | 4.1 | 11.30 | 35.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 356 | 34.0 | 70.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 87.0 | 38.0 | 0.50 | 144.0 | 4.8 | 17.10 | 47.0 | 7400.0 | 6.1 | 0 | 0 | 0 | 0 | 0 | 0 |
| 80 | 74.0 | 80.0 | 1.010 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 132.0 | 98.0 | 2.80 | 133.0 | 5.0 | 10.80 | 31.0 | 9400.0 | 3.8 | 1 | 1 | 0 | 0 | 0 | 0 |
| 258 | 42.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 98.0 | 20.0 | 0.50 | 140.0 | 3.5 | 13.90 | 44.0 | 8400.0 | 5.5 | 0 | 0 | 0 | 0 | 0 | 0 |
| 11 | 63.0 | 70.0 | 1.010 | 3.0 | 0.0 | 0 | 0 | 1 | 0 | 380.0 | 60.0 | 2.70 | 131.0 | 4.2 | 10.80 | 32.0 | 4500.0 | 3.8 | 1 | 1 | 0 | 1 | 1 | 0 |
| 298 | 34.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 91.0 | 49.0 | 1.20 | 135.0 | 4.5 | 13.50 | 48.0 | 8600.0 | 4.9 | 0 | 0 | 0 | 0 | 0 | 0 |
| 86 | 56.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 415.0 | 37.0 | 1.90 | 138.0 | 4.4 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 1 | 0 | 0 | 0 | 0 |
[output truncated: row-level preview of the cleaned and encoded dataset, one row per patient with id, the numeric lab values, the encoded categorical flags, and the 0/1 class label]
# scores for the features
for i in range(len(ordered_feature.scores_)):
    print('Feature %s: %f' % (no_scale_X_train.columns[i], ordered_feature.scores_[i]))
# plot the scores
plt.bar([i for i in range(len(ordered_feature.scores_))], ordered_feature.scores_)
plt.show()
Feature age: 109.358574
Feature blood_pressure: 56.278555
Feature specific_gravity: 0.003129
Feature albumin: 139.820225
Feature sugar: 61.314607
Feature red_blood_cells: 1.920005
Feature pus_cell: 7.393656
Feature pus_cell_clumps: 16.617978
Feature bacteria: 7.449438
Feature blood_glucose_random: 1556.035234
Feature blood_urea: 1289.937495
Feature serum_creatinine: 187.996095
Feature sodium: 13.576238
Feature potassium: 3.067875
Feature haemoglobin: 74.887437
Feature packed_cell_volume: 187.622622
Feature white_blood_cell_count: 4482.392254
Feature red_blood_cell_count: 10.259505
Feature ypertension: 61.314607
Feature diabetes_mellitus: 58.449438
Feature coronary_artery_disease: 12.606742
Feature appetite: 33.808989
Feature pedal_edema: 32.089888
Feature anemia: 21.202247
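To see at a glance which features dominate, you can sort the printed scores. A minimal sketch using plain Python; the dict below copies a handful of values from the output above (in the notebook you would build it from `no_scale_X_train.columns` and `ordered_feature.scores_` directly):

```python
# A few feature scores copied from the printed output above
scores = {
    'white_blood_cell_count': 4482.392254,
    'blood_glucose_random': 1556.035234,
    'blood_urea': 1289.937495,
    'serum_creatinine': 187.996095,
    'packed_cell_volume': 187.622622,
    'specific_gravity': 0.003129,
}

# Rank features by score, highest first, and keep the top three
top3 = sorted(scores, key=scores.get, reverse=True)[:3]
print(top3)
```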
pca = PCA()
X_train_pca = pca.fit_transform(X_train)
pca.explained_variance_ratio_
array([0.28179657, 0.07930775, 0.07657652, 0.06739605, 0.04853508,
0.0474705 , 0.04396113, 0.04111073, 0.03520233, 0.03418584,
0.03078254, 0.02810963, 0.0260357 , 0.02384717, 0.02213417,
0.01999963, 0.01756435, 0.01554254, 0.01415419, 0.01227293,
0.01067028, 0.00941838, 0.00816795, 0.00575802])
plt.bar(range(1, 25), pca.explained_variance_ratio_, align='center')
plt.step(range(1, 25), np.cumsum(pca.explained_variance_ratio_), where='mid')
plt.ylabel('Explained variance ratio')
plt.xlabel('Principal components')
plt.show()
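Rather than eyeballing the bar chart, you can pick the smallest number of components whose cumulative ratio crosses a chosen threshold. A minimal sketch (assumes numpy; the ratios are copied from `pca.explained_variance_ratio_` above, and the 90% threshold is an illustrative choice):

```python
import numpy as np

# Explained variance ratios copied from pca.explained_variance_ratio_ above
ratios = np.array([0.28179657, 0.07930775, 0.07657652, 0.06739605, 0.04853508,
                   0.0474705 , 0.04396113, 0.04111073, 0.03520233, 0.03418584,
                   0.03078254, 0.02810963, 0.0260357 , 0.02384717, 0.02213417,
                   0.01999963, 0.01756435, 0.01554254, 0.01415419, 0.01227293,
                   0.01067028, 0.00941838, 0.00816795, 0.00575802])

# Smallest n such that the first n components explain at least 90% of variance
n_components = int(np.argmax(np.cumsum(ratios) >= 0.90)) + 1
print(n_components)
```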
pca = PCA(n_components=2)
X_train_pca = pca.fit_transform(X_train)
X_test_pca = pca.transform(X_test)
plt.scatter(X_train_pca[:, 0], X_train_pca[:, 1])
plt.xlabel('PC 1')
plt.ylabel('PC 2')
plt.show()
from matplotlib.colors import ListedColormap
def plot_decision_regions(X, y, classifier, test_idx=None, resolution=0.02):
    # setup marker generator and color map
    markers = ('o', 's', '^', 'v', '<')
    colors = ('red', 'blue', 'lightgreen', 'gray', 'cyan')
    cmap = ListedColormap(colors[:len(np.unique(y))])
    # plot the decision surface
    x1_min, x1_max = X[:, 0].min() - 1, X[:, 0].max() + 1
    x2_min, x2_max = X[:, 1].min() - 1, X[:, 1].max() + 1
    xx1, xx2 = np.meshgrid(np.arange(x1_min, x1_max, resolution),
                           np.arange(x2_min, x2_max, resolution))
    lab = classifier.predict(np.array([xx1.ravel(), xx2.ravel()]).T)
    lab = lab.reshape(xx1.shape)
    plt.contourf(xx1, xx2, lab, alpha=0.3, cmap=cmap)
    plt.xlim(xx1.min(), xx1.max())
    plt.ylim(xx2.min(), xx2.max())
    # plot class examples
    for idx, cl in enumerate(np.unique(y)):
        plt.scatter(x=X[y == cl, 0],
                    y=X[y == cl, 1],
                    alpha=0.8,
                    c=colors[idx],
                    marker=markers[idx],
                    label=f'Class {cl}',
                    edgecolor='black')
pca = PCA(n_components=2)
X_train_pca = pca.fit_transform(X_train)
X_test_pca = pca.transform(X_test)
lr = LogisticRegression(multi_class='ovr', random_state=1, solver='lbfgs')
lr = lr.fit(X_train_pca, y_train)
plot_decision_regions(X_train_pca, y_train, classifier=lr)
plt.xlabel('PC 1')
plt.ylabel('PC 2')
plt.legend(loc='lower left')
plt.tight_layout()
# plt.savefig('figures/05_04.png', dpi=300)
plt.show()
plot_decision_regions(X_test_pca, y_test, classifier=lr)
plt.xlabel('PC 1')
plt.ylabel('PC 2')
plt.legend(loc='lower left')
plt.tight_layout()
# plt.savefig('figures/05_05.png', dpi=300)
plt.show()
pca = PCA(n_components=2)
X_train_pca = pca.fit_transform(X_train)
X_test_pca = pca.transform(X_test)
lgbm = LGBMClassifier()
lgbm = lgbm.fit(X_train_pca, y_train)
plot_decision_regions(X_train_pca, y_train, classifier=lgbm)
plt.xlabel('PC 1')
plt.ylabel('PC 2')
plt.legend(loc='lower left')
plt.tight_layout()
# plt.savefig('figures/05_04.png', dpi=300)
plt.show()
plot_decision_regions(X_test_pca, y_test, classifier=lgbm)
plt.xlabel('PC 1')
plt.ylabel('PC 2')
plt.legend(loc='lower left')
plt.tight_layout()
# plt.savefig('figures/05_05.png', dpi=300)
plt.show()
cov_mat = np.cov(X_train.T)
eigen_vals, eigen_vecs = np.linalg.eig(cov_mat)
print('\nEigenvalues \n', eigen_vals)
Eigenvalues
 [6.3684879  1.79232304 1.73059832 1.52312325 1.09687319 1.07281395
  0.99350366 0.92908583 0.7955583  0.77258613 0.69567299 0.13012886
  0.63526618 0.58839633 0.53893645 0.50022321 0.45198361 0.18459241
  0.21285161 0.24114403 0.39694715 0.27736316 0.35125517 0.319879  ]
tot = sum(eigen_vals)
var_exp = [(i / tot) for i in sorted(eigen_vals, reverse=True)]
cum_var_exp = np.cumsum(var_exp)
plt.bar(range(1, 25), var_exp, align='center',
label='Individual explained variance')
plt.step(range(1, 25), cum_var_exp, where='mid',
label='Cumulative explained variance')
plt.ylabel('Explained variance ratio')
plt.xlabel('Principal component index')
plt.legend(loc='best')
plt.tight_layout()
# plt.savefig('figures/05_02.png', dpi=300)
plt.show()
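The eigen-decomposition above is exactly what PCA does under the hood: sorting the eigenvalues of the covariance matrix in descending order and dividing by their sum reproduces `explained_variance_ratio_`. A minimal numpy sketch on synthetic correlated data (the mixing matrix is an arbitrary illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(0)
# Correlated 2-D data: one direction carries most of the variance
X = rng.normal(size=(500, 2)) @ np.array([[3.0, 0.0], [1.0, 0.5]])

# Eigenvalues of the covariance matrix (eigvalsh returns ascending order)
eigen_vals = np.linalg.eigvalsh(np.cov(X.T))

# Sorted descending and normalized, these are the explained variance ratios
var_exp = sorted(eigen_vals / eigen_vals.sum(), reverse=True)
print(var_exp)
```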
The first model you build may not be a good one; you need to improve it.
In the majority of classification problems the target class is imbalanced, so you need to balance it to get the best modelling results.
In this section you will balance the classes and tune hyperparameters.
Imbalanced classes are a common problem in machine learning classification: there is a disproportionate ratio of observations in each class.
Most machine learning algorithms work best when the number of samples in each class is about equal, because most algorithms are designed to maximize accuracy and reduce error.
Here, you will upsample the minority class.
# Over-sample the minority class (imbalanced-learn is not loaded by pyforest)
from imblearn.over_sampling import RandomOverSampler
ros = RandomOverSampler()
X_ros, y_ros = ros.fit_resample(X, y)
y_ros.value_counts()
0    250
1    250
Name: class, dtype: int64
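`RandomOverSampler` balances the classes by resampling minority-class rows with replacement until both classes have the same count, which is why both classes end up at 250 above. A toy sketch of the idea in plain Python (the rows and counts here are made up for illustration):

```python
import random

random.seed(0)
# Toy imbalanced data: 6 majority-class rows (0) vs 2 minority-class rows (1)
rows = [(i, 0) for i in range(6)] + [(6, 1), (7, 1)]
majority = [r for r in rows if r[1] == 0]
minority = [r for r in rows if r[1] == 1]

# Random over-sampling: draw minority rows with replacement
# until the minority class matches the majority class size
upsampled = majority + [random.choice(minority) for _ in range(len(majority))]
counts = {0: sum(1 for _, y in upsampled if y == 0),
          1: sum(1 for _, y in upsampled if y == 1)}
print(counts)
```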
# Define the function to build model on balanced dataset
def classification_model(X, y):
    scaled_X = scale_data(X)
    # Split the dataset into the training set and test set
    X_train, X_test, y_train, y_test = train_test_split(scaled_X, y, test_size=0.3, random_state=0)
    # Train the model (`model` comes from the enclosing loop over models)
    model.fit(X_train, y_train)
    # Predict class for the test dataset
    y_pred = model.predict(X_test)
    # Predict probability for the test dataset
    y_pred_prod = model.predict_proba(X_test)
    y_pred_prod = [x[1] for x in y_pred_prod]
    # Compute evaluation metrics
    compute_evaluation_metric(model, X_test, y_test, y_pred, y_pred_prod)
    return model
# Build model on balanced data and get evaluation metrics
# run balanced evaluation metrics on all models
for name, model in models:
    print(name)
    classification_model(X_ros, y_ros)
LR
Accuracy Score :
0.98
AUC Score :
0.9994665718349929
Confusion Matrix :
[[71 3]
[ 0 76]]
Classification Report :
precision recall f1-score support
0 1.00 0.96 0.98 74
1 0.96 1.00 0.98 76
accuracy 0.98 150
macro avg 0.98 0.98 0.98 150
weighted avg 0.98 0.98 0.98 150
ROC curve :
LDA
Accuracy Score :
0.9666666666666667
AUC Score :
0.9992887624466572
Confusion Matrix :
[[69 5]
[ 0 76]]
Classification Report :
precision recall f1-score support
0 1.00 0.93 0.97 74
1 0.94 1.00 0.97 76
accuracy 0.97 150
macro avg 0.97 0.97 0.97 150
weighted avg 0.97 0.97 0.97 150
ROC curve :
KNN
Accuracy Score :
0.9533333333333334
AUC Score :
0.986219772403983
Confusion Matrix :
[[67 7]
[ 0 76]]
Classification Report :
precision recall f1-score support
0 1.00 0.91 0.95 74
1 0.92 1.00 0.96 76
accuracy 0.95 150
macro avg 0.96 0.95 0.95 150
weighted avg 0.96 0.95 0.95 150
ROC curve :
CART
Accuracy Score :
0.98
AUC Score :
0.9799075391180655
Confusion Matrix :
[[72 2]
[ 1 75]]
Classification Report :
precision recall f1-score support
0 0.99 0.97 0.98 74
1 0.97 0.99 0.98 76
accuracy 0.98 150
macro avg 0.98 0.98 0.98 150
weighted avg 0.98 0.98 0.98 150
ROC curve :
RFC
Accuracy Score :
1.0
AUC Score :
1.0
Confusion Matrix :
[[74 0]
[ 0 76]]
Classification Report :
precision recall f1-score support
0 1.00 1.00 1.00 74
1 1.00 1.00 1.00 76
accuracy 1.00 150
macro avg 1.00 1.00 1.00 150
weighted avg 1.00 1.00 1.00 150
ROC curve :
XGB
[13:30:34] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior.
Accuracy Score :
1.0
AUC Score :
1.0
Confusion Matrix :
[[74 0]
[ 0 76]]
Classification Report :
precision recall f1-score support
0 1.00 1.00 1.00 74
1 1.00 1.00 1.00 76
accuracy 1.00 150
macro avg 1.00 1.00 1.00 150
weighted avg 1.00 1.00 1.00 150
ROC curve :
NB
Accuracy Score :
0.98
AUC Score :
0.9797297297297297
Confusion Matrix :
[[71 3]
[ 0 76]]
Classification Report :
precision recall f1-score support
0 1.00 0.96 0.98 74
1 0.96 1.00 0.98 76
accuracy 0.98 150
macro avg 0.98 0.98 0.98 150
weighted avg 0.98 0.98 0.98 150
ROC curve :
LGB
Accuracy Score :
0.9933333333333333
AUC Score :
1.0
Confusion Matrix :
[[73 1]
[ 0 76]]
Classification Report :
precision recall f1-score support
0 1.00 0.99 0.99 74
1 0.99 1.00 0.99 76
accuracy 0.99 150
macro avg 0.99 0.99 0.99 150
weighted avg 0.99 0.99 0.99 150
ROC curve :
A hyperparameter is a parameter whose value is set before the learning process begins.
Hyperparameter tuning refers to the automatic optimization of the hyperparameters of an ML model.
# Split the dataset into the training set and test set
X_train, X_test, y_train, y_test = train_test_split(X_ros, y_ros, test_size = 0.3, random_state = 0)
# Define the parameter grid for the decision tree
param_grid_decision_tree = {'criterion': ['gini', 'entropy'],
                            'max_depth': [10, 15, 20, 30, 40, 50],
                            'min_samples_leaf': [1, 2, 5]
                            }
# Define the parameter grid for the random forest
param_grid_random_forest = {'max_depth': [10, 20, 40],
                            'n_estimators': [100, 200, 300],
                            'min_samples_leaf': [1, 2, 5]
                            }
# Define the parameter grid for XGBoost
param_grid_xgb = {'min_child_weight': [1, 5, 10],
                  'gamma': [0, 1],
                  'max_depth': [5, 10],
                  'learning_rate': [0.05, 0.1]
                  }
# Define the parameter grid for LGBM
param_grid_lgbm = {'n_estimators': [100, 200],
                   'num_leaves': [256, 128],
                   'max_depth': [5, 8, 10],
                   'learning_rate': [0.05, 0.1]
                   }
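Exhaustive grid search tries every combination of the listed values, so its cost grows multiplicatively with the grid. A quick stdlib sketch that counts the candidates for two of the grids (the grids are repeated here so the snippet is self-contained); these counts match the "24 candidates" and "27 candidates" in the GridSearchCV logs later in this section:

```python
from itertools import product

param_grid_lgbm = {'n_estimators': [100, 200],
                   'num_leaves': [256, 128],
                   'max_depth': [5, 8, 10],
                   'learning_rate': [0.05, 0.1]}
param_grid_random_forest = {'max_depth': [10, 20, 40],
                            'n_estimators': [100, 200, 300],
                            'min_samples_leaf': [1, 2, 5]}

def n_candidates(grid):
    # One candidate per element of the Cartesian product of the value lists
    return len(list(product(*grid.values())))

print(n_candidates(param_grid_lgbm))           # 2 * 2 * 3 * 2 = 24
print(n_candidates(param_grid_random_forest))  # 3 * 3 * 3 = 27
```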
def grid_model(X, y):
    scaled_X = scale_data(X)
    # Split the dataset into the training set and test set
    X_train, X_test, y_train, y_test = train_test_split(scaled_X, y, test_size=0.3, random_state=0)
    # Run grid search for LGBM
    model = LGBMClassifier()
    param_grid = param_grid_lgbm
    grid = GridSearchCV(model, param_grid, cv=10, refit=True, verbose=3, n_jobs=-1)
    # Fit the model for grid search
    grid.fit(X_train, y_train)
    # Predict class for the test dataset
    y_pred = grid.predict(X_test)
    # Predict probability for the test dataset
    y_pred_prod = grid.predict_proba(X_test)
    y_pred_prod = [x[1] for x in y_pred_prod]
    print("Y predicted : ", y_pred)
    print("Y probability predicted : ", y_pred_prod[:5])
    # Compute evaluation metrics
    compute_evaluation_metric(grid, X_test, y_test, y_pred, y_pred_prod)
    # Save the best model to disk (pickle is stdlib, not loaded by pyforest)
    import pickle
    filename = 'final_model.sav'
    pickle.dump(grid.best_estimator_, open(filename, 'wb'))
    return grid
def grid_model_RFC(X, y):
    scaled_X = scale_data(X)
    # Split the dataset into the training set and test set
    X_train, X_test, y_train, y_test = train_test_split(scaled_X, y, test_size=0.3, random_state=0)
    # Run grid search for the random forest
    model = RandomForestClassifier()
    param_grid = param_grid_random_forest
    grid = GridSearchCV(model, param_grid, cv=10, refit=True, verbose=3, n_jobs=-1)
    # Fit the model for grid search
    grid.fit(X_train, y_train)
    # Predict class for the test dataset
    y_pred = grid.predict(X_test)
    # Predict probability for the test dataset
    y_pred_prod = grid.predict_proba(X_test)
    y_pred_prod = [x[1] for x in y_pred_prod]
    print("Y predicted : ", y_pred)
    print("Y probability predicted : ", y_pred_prod[:5])
    # Compute evaluation metrics
    compute_evaluation_metric(grid, X_test, y_test, y_pred, y_pred_prod)
    # Save the best model to disk (pickle is stdlib, not loaded by pyforest)
    import pickle
    filename = 'final_model.sav'
    pickle.dump(grid.best_estimator_, open(filename, 'wb'))
    return grid
def random_search(X, y):
    scaled_X = scale_data(X)
    # Split the dataset into the training set and test set
    X_train, X_test, y_train, y_test = train_test_split(scaled_X, y, test_size=0.3, random_state=0)
    # Run random search for LGBM
    model = LGBMClassifier()
    param_rdn = param_grid_lgbm
    random_search = RandomizedSearchCV(model, param_distributions=param_rdn, n_iter=5,
                                       scoring='roc_auc', n_jobs=-1, cv=5, verbose=3)
    # Fit the model for random search
    random_search.fit(X_train, y_train)
    print(random_search.best_estimator_)
    print(random_search.best_params_)
    # Predict class for the test dataset
    y_pred = random_search.predict(X_test)
    # Predict probability for the test dataset
    y_pred_prod = random_search.predict_proba(X_test)
    y_pred_prod = [x[1] for x in y_pred_prod]
    print("Y predicted : ", y_pred)
    print("Y probability predicted : ", y_pred_prod[:5])
    # Compute evaluation metrics
    compute_evaluation_metric(random_search, X_test, y_test, y_pred, y_pred_prod)
    return random_search
def random_search_RFC(X, y):
    scaled_X = scale_data(X)
    # Split the dataset into the training set and test set
    X_train, X_test, y_train, y_test = train_test_split(scaled_X, y, test_size=0.3, random_state=0)
    # Run random search for the random forest
    model = RandomForestClassifier()
    param_rdn = param_grid_random_forest
    random_search = RandomizedSearchCV(model, param_distributions=param_rdn, n_iter=5,
                                       scoring='roc_auc', n_jobs=-1, cv=5, verbose=3)
    # Fit the model for random search
    random_search.fit(X_train, y_train)
    print(random_search.best_estimator_)
    print(random_search.best_params_)
    # Predict class for the test dataset
    y_pred = random_search.predict(X_test)
    # Predict probability for the test dataset
    y_pred_prod = random_search.predict_proba(X_test)
    y_pred_prod = [x[1] for x in y_pred_prod]
    print("Y predicted : ", y_pred)
    print("Y probability predicted : ", y_pred_prod[:5])
    # Compute evaluation metrics
    compute_evaluation_metric(random_search, X_test, y_test, y_pred, y_pred_prod)
    return random_search
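With n_iter=5, RandomizedSearchCV evaluates only 5 of the possible combinations instead of the full grid, which is why the randomized runs below log "5 candidates" while the grid runs fit all 24 or 27. A stdlib sketch of the sampling idea (scikit-learn's own sampler differs in detail, e.g. it can draw from distributions):

```python
import random
from itertools import product

random.seed(0)
param_grid_random_forest = {'max_depth': [10, 20, 40],
                            'n_estimators': [100, 200, 300],
                            'min_samples_leaf': [1, 2, 5]}

# All 27 combinations the full grid search would try
all_combos = list(product(*param_grid_random_forest.values()))

# Random search evaluates only a fixed-size sample of them
sampled = random.sample(all_combos, k=5)
print(len(all_combos), len(sampled))
```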
random_search(X_ros, y_ros)
Fitting 5 folds for each of 5 candidates, totalling 25 fits
LGBMClassifier(max_depth=5, num_leaves=128)
{'num_leaves': 128, 'n_estimators': 100, 'max_depth': 5, 'learning_rate': 0.1}
Y predicted : [0 1 1 1 1 0 1 1 0 0 0 1 1 1 0 1 1 1 0 0 0 1 0 0 0 1 0 1 0 0 1 0 0 0 1 1 1
1 1 0 1 0 1 0 0 0 0 1 1 0 1 0 1 0 0 0 1 0 0 1 0 1 1 1 1 0 0 1 1 0 0 0 1 1
0 1 1 0 1 1 1 0 1 0 1 1 1 0 1 0 0 1 1 0 0 1 0 1 0 0 0 1 0 1 0 1 0 0 1 0 1
0 0 0 1 1 1 1 0 0 1 0 0 1 1 0 0 1 1 1 1 0 1 0 1 1 1 1 1 1 0 0 0 0 1 0 1 1
0 0]
Y probability predicted : [1.8295233060767557e-05, 0.9998964453395851, 0.9994070887037961, 0.9999720723882884, 0.9999644255952257]
Accuracy Score :
0.9933333333333333
AUC Score :
1.0
Confusion Matrix :
[[73 1]
[ 0 76]]
Classification Report :
precision recall f1-score support
0 1.00 0.99 0.99 74
1 0.99 1.00 0.99 76
accuracy 0.99 150
macro avg 0.99 0.99 0.99 150
weighted avg 0.99 0.99 0.99 150
ROC curve :
Visualize Confusion Matrix : <sklearn.metrics._plot.confusion_matrix.ConfusionMatrixDisplay object at 0x2ae8d9c40>
RandomizedSearchCV(cv=5, estimator=LGBMClassifier(), n_iter=5, n_jobs=-1,
param_distributions={'learning_rate': [0.05, 0.1],
'max_depth': [5, 8, 10],
'n_estimators': [100, 200],
'num_leaves': [256, 128]},
scoring='roc_auc', verbose=3)
random_search_RFC(X_ros, y_ros)
Fitting 5 folds for each of 5 candidates, totalling 25 fits
RandomForestClassifier(max_depth=40, n_estimators=300)
{'n_estimators': 300, 'min_samples_leaf': 1, 'max_depth': 40}
Y predicted : [0 1 1 1 1 0 1 1 0 0 0 1 1 1 0 1 1 1 0 0 0 1 0 0 0 1 0 1 0 0 1 0 0 0 1 1 1
1 1 0 1 0 1 0 0 0 0 1 1 0 1 0 1 0 0 0 1 0 0 1 0 1 1 1 1 0 0 1 1 0 0 0 1 1
0 1 1 0 1 1 1 0 1 0 1 1 1 0 1 0 0 1 1 0 0 1 0 1 0 0 0 1 0 1 0 1 0 0 1 0 1
0 0 0 1 1 1 1 0 0 1 0 0 1 1 0 0 1 1 1 1 0 1 0 1 1 1 1 1 0 0 0 0 0 1 0 1 1
0 0]
Y probability predicted : [0.02666666666666667, 0.9933333333333333, 0.9, 0.9966666666666667, 0.97]
Accuracy Score :
1.0
AUC Score :
1.0
Confusion Matrix :
[[74 0]
[ 0 76]]
Classification Report :
precision recall f1-score support
0 1.00 1.00 1.00 74
1 1.00 1.00 1.00 76
accuracy 1.00 150
macro avg 1.00 1.00 1.00 150
weighted avg 1.00 1.00 1.00 150
ROC curve :
Visualize Confusion Matrix : <sklearn.metrics._plot.confusion_matrix.ConfusionMatrixDisplay object at 0x29d3e0b50>
RandomizedSearchCV(cv=5, estimator=RandomForestClassifier(), n_iter=5,
n_jobs=-1,
param_distributions={'max_depth': [10, 20, 40],
'min_samples_leaf': [1, 2, 5],
'n_estimators': [100, 200, 300]},
scoring='roc_auc', verbose=3)
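The helper `random_search_RFC` called above is defined earlier in the notebook; its body is not shown in this section, so the following is a minimal sketch of what such a helper presumably looks like, wrapping `RandomizedSearchCV` around a `RandomForestClassifier` with the same search space as the `param_distributions` printed above (the synthetic data here is illustrative, not the kidney-disease set):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV

def random_search_RFC(X, y):
    # Same search space as the param_distributions shown in the output above.
    params = {'n_estimators': [100, 200, 300],
              'min_samples_leaf': [1, 2, 5],
              'max_depth': [10, 20, 40]}
    # Sample 5 random combinations, score each with 5-fold ROC AUC.
    search = RandomizedSearchCV(RandomForestClassifier(), params,
                                n_iter=5, cv=5, scoring='roc_auc',
                                n_jobs=-1, verbose=3, random_state=0)
    search.fit(X, y)
    print(search.best_estimator_)
    print(search.best_params_)
    return search

# Illustrative stand-in for the resampled training data (X_ros, y_ros).
X, y = make_classification(n_samples=150, random_state=0)
random_search_RFC(X, y)
```

Randomized search trades exhaustiveness for speed: only `n_iter` of the 27 possible combinations are evaluated, which is why it finishes 25 fits instead of 135.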
grid_model(X_ros, y_ros)
Fitting 10 folds for each of 24 candidates, totalling 240 fits
Y predicted : [0 1 1 1 1 0 1 1 0 0 0 1 1 1 0 1 1 1 0 0 0 1 0 0 0 1 0 1 0 0 1 0 0 0 1 1 1
1 1 0 1 0 1 0 0 0 0 1 1 0 1 0 1 0 0 0 1 0 0 1 0 1 1 1 1 0 0 1 1 0 0 0 1 1
0 1 1 0 1 1 1 0 1 0 1 1 1 0 1 0 0 1 1 0 0 1 0 1 0 0 0 1 0 1 0 1 0 0 1 0 1
0 0 0 1 1 1 1 0 0 1 0 0 1 1 0 0 1 1 1 1 0 1 0 1 1 1 1 1 0 0 0 0 0 1 0 1 1
0 0]
Y probability predicted : [0.002658638396054072, 0.9943807730865569, 0.9539336880566996, 0.9962327143837482, 0.9954365482477768]
Accuracy Score :
1.0
AUC Score :
1.0
Confusion Matrix :
[[74 0]
[ 0 76]]
Classification Report :
precision recall f1-score support
0 1.00 1.00 1.00 74
1 1.00 1.00 1.00 76
accuracy 1.00 150
macro avg 1.00 1.00 1.00 150
weighted avg 1.00 1.00 1.00 150
ROC curve :
Visualize Confusion Matrix : <sklearn.metrics._plot.confusion_matrix.ConfusionMatrixDisplay object at 0x29e2c5460>
GridSearchCV(cv=10, estimator=LGBMClassifier(), n_jobs=-1,
param_grid={'learning_rate': [0.05, 0.1], 'max_depth': [5, 8, 10],
'n_estimators': [100, 200], 'num_leaves': [256, 128]},
verbose=3)
grid_model_RFC(X_ros, y_ros)
Fitting 10 folds for each of 27 candidates, totalling 270 fits
Y predicted : [0 1 1 1 1 0 1 1 0 0 0 1 1 1 0 1 1 1 0 0 0 1 0 0 0 1 0 1 0 0 1 0 0 0 1 1 1
1 1 0 1 0 1 0 0 0 0 1 1 0 1 0 1 0 0 0 1 0 0 1 0 1 1 1 1 0 0 1 1 0 0 0 1 1
0 1 1 0 1 1 1 0 1 0 1 1 1 0 1 0 0 1 1 0 0 1 0 1 0 0 0 1 0 1 0 1 0 0 1 0 1
0 0 0 1 1 1 1 0 0 1 0 0 1 1 0 0 1 1 1 1 0 1 0 1 1 1 1 1 0 0 0 0 0 1 0 1 1
0 0]
Y probability predicted : [0.0, 0.99, 0.89, 0.99, 0.979375]
Accuracy Score :
1.0
AUC Score :
1.0
Confusion Matrix :
[[74 0]
[ 0 76]]
Classification Report :
precision recall f1-score support
0 1.00 1.00 1.00 74
1 1.00 1.00 1.00 76
accuracy 1.00 150
macro avg 1.00 1.00 1.00 150
weighted avg 1.00 1.00 1.00 150
ROC curve :
Visualize Confusion Matrix : <sklearn.metrics._plot.confusion_matrix.ConfusionMatrixDisplay object at 0x2a961e3d0>
GridSearchCV(cv=10, estimator=RandomForestClassifier(), n_jobs=-1,
param_grid={'max_depth': [10, 20, 40],
'min_samples_leaf': [1, 2, 5],
'n_estimators': [100, 200, 300]},
verbose=3)
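As with the randomized-search helper, `grid_model_RFC` is defined earlier in the notebook; a plausible sketch, assuming it simply wraps `GridSearchCV` with the grid printed above (function body and data are illustrative):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

def grid_model_RFC(X, y):
    # Exhaustive search: every combination in the grid (3*3*3 = 27
    # candidates), each scored with 10-fold cross-validation = 270 fits.
    param_grid = {'max_depth': [10, 20, 40],
                  'min_samples_leaf': [1, 2, 5],
                  'n_estimators': [100, 200, 300]}
    grid = GridSearchCV(RandomForestClassifier(), param_grid,
                        cv=10, n_jobs=-1, verbose=3)
    grid.fit(X, y)
    return grid  # grid.best_estimator_ is the refit winning model

# Illustrative stand-in for the resampled training data (X_ros, y_ros).
X, y = make_classification(n_samples=150, random_state=0)
search = grid_model_RFC(X, y)
```

Unlike the randomized search, the grid search guarantees the best combination within the grid is found, at the cost of fitting every candidate.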
# save the model to disk
filename = 'final_model.sav'
pickle.dump(grid_model.best_estimator_, open(filename, 'wb'))
# load the model from disk
loaded_model = pickle.load(open(filename, 'rb'))
loaded_model
LGBMClassifier(learning_rate=0.05, max_depth=5, num_leaves=256)
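The save/load cells above can be exercised end to end. A minimal self-contained sketch of the same pickle round-trip, using a stand-in estimator and synthetic data rather than the notebook's trained `best_estimator_` and the kidney-disease set:

```python
import os
import pickle
import tempfile

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Stand-in for grid_model.best_estimator_ (synthetic data for illustration).
X, y = make_classification(n_samples=200, n_features=8, random_state=0)
model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# Save the model to disk, then load it back -- same pattern as the cells above.
filename = os.path.join(tempfile.gettempdir(), 'final_model.sav')
with open(filename, 'wb') as f:
    pickle.dump(model, f)
with open(filename, 'rb') as f:
    loaded_model = pickle.load(f)

# The reloaded estimator reproduces the original predictions exactly.
print((loaded_model.predict(X) == model.predict(X)).all())  # prints True
```

Note that a pickled estimator is only guaranteed to load correctly under the same scikit-learn (and, for the LGBM model, lightgbm) version it was saved with.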
[Verbose per-fold fit logs (verbose=3) from the RandomizedSearchCV and GridSearchCV runs above, interleaved out of order by the parallel workers (n_jobs=-1). Per-fold ROC AUC scores ranged from roughly 0.929 to 1.000; the full logs are omitted here for readability.]